Brews, Bubbles, & Booze: Stroke Risk and Patients’ Favorite Drinks


A growing body of research explores the link between stroke risk and regular consumption of coffee, tea, soda, and alcohol. This research roundup reviews the latest findings, highlighting both promising insights and remaining uncertainties to help guide discussions with your patients.

Coffee and Tea: Good or Bad? 

In the INTERSTROKE study, high coffee consumption (> 4 cups daily) was associated with a significantly increased risk for all stroke (odds ratio [OR], 1.37) and ischemic stroke (OR, 1.31), while low to moderate coffee intake was not linked to increased stroke risk. In contrast, tea consumption was associated with lower odds of all stroke (OR, 0.81 for the highest intake) and ischemic stroke (OR, 0.81). 

In a recent UK Biobank study, consumption of coffee or tea was associated with reduced risk for stroke and dementia, with the biggest benefit associated with consuming both beverages. 

Specifically, the investigators found that individuals who drank two to three cups of coffee and two to three cups of tea per day had a 30% decrease in incidence of stroke and a 28% lower risk for dementia versus those who did not.

A recent systematic review and dose-response meta-analysis showed that each daily cup increase in tea was associated with an average 4% reduced risk for stroke and a 2% reduced risk for cardiovascular disease (CVD) events. 

The protective effect of coffee and tea on stroke risk may be driven, in part, by flavonoids, which have antioxidant and anti-inflammatory properties, as well as positive effects on vascular function.

“The advice to patients should be that coffee and tea may protect against stroke, but that sweetening either beverage with sugar probably should be minimized,” said Cheryl Bushnell, MD, MHS, of Wake Forest University School of Medicine in Winston-Salem, North Carolina, and chair of the American Stroke Association (ASA) 2024 Guideline for the Primary Prevention of Stroke.

Taylor Wallace, PhD, a certified food scientist, said, “most people should consume a cup or two of unsweetened tea per day in moderation for cardiometabolic health. It is an easy step in the right direction for good health but not a cure-all.”

When it comes to coffee, adults who like it should drink it “in moderation — just lay off the cream and sugar,” said Wallace, adjunct associate professor at George Washington University, Washington, DC, and Tufts University, Boston, Massachusetts.

“A cup or two of black coffee with low-fat or nonfat milk with breakfast is a healthy way to start the day, especially when you’re like me and have an 8-year-old that is full of energy!” Wallace said. 

The Skinny on Soda

When it comes to sugar-sweetened and diet beverages, data from the Nurses’ Health Study and Health Professionals Follow-Up Study showed a 16% increased risk for stroke with one or more daily servings of sugar-sweetened or low-calorie soda (vs none), independent of established dietary and nondietary cardiovascular risk factors. 

In the Women’s Health Initiative Observational Study of postmenopausal women, a higher intake of artificially sweetened beverages was associated with increased risk for all stroke (adjusted hazard ratio [aHR], 1.23), ischemic stroke (aHR, 1.31), coronary heart disease (aHR, 1.29) and all-cause mortality (aHR, 1.16).

In the Framingham Heart Study Offspring cohort, consumption of one can of diet soda or more each day (vs none) was associated with a nearly threefold increased risk for stroke and dementia over a 10-year follow-up period. 

A separate French study showed that total artificial sweetener intake from all sources was associated with increased overall risk for cardiovascular and cerebrovascular disease.

However, given the limitations of these studies, it’s hard to draw any firm conclusions, Wallace cautioned. 

“We know that sugar-sweetened beverages are correlated with weight gain and cardiometabolic dysfunction promotion in children and adults,” he said. 

Yet, “there really isn’t any convincing evidence that diet soda has much impact on human health at all. Most observational studies are mixed and likely very confounded by other diet and lifestyle factors. That doesn’t mean go overboard; a daily diet soda is probably fine, but that doesn’t mean go drink 10 of them every day,” he added. 

Alcohol: Moderation or Abstinence?

Evidence on alcohol use and stroke risk has been mixed over the years. For decades, the evidence suggested that a moderate amount of alcohol daily (one to two drinks in men and one drink in women) might be beneficial in reducing major vascular outcomes.

Yet, over the past few years, some research has found no evidence of benefit with moderate alcohol intake. And the detrimental effects of excessive alcohol use are clear. 

A large meta-analysis showed that light to moderate alcohol consumption (up to one drink per day) was associated with a reduced risk for ischemic stroke. However, heavy drinking (more than two drinks per day) significantly increased the risk for both ischemic and hemorrhagic stroke.

A separate study showed young adults who are moderate to heavy drinkers are at increased risk for stroke — and the risk increases with more years of imbibing.

In the INTERSTROKE study, high to moderate alcohol consumption was associated with increased stroke risk, whereas low alcohol consumption conferred no increased risk. 

However, Bushnell pointed out that the study data were based on self-report and that other healthy behaviors may counteract the risk associated with alcohol consumption.

“For alcohol, regardless of stroke risk, the most important data shows that any alcohol consumption is associated with worse cognitive function, so generally, the lower the alcohol consumption the better,” Bushnell said. 

She noted that, currently, the American Heart Association (AHA)/ASA recommend a maximum of two drinks per day for men and one drink per day for women to reduce stroke risk.

“However, the data for the risk for cognitive impairment with any alcohol is convincing and should be kept in mind in addition to the maximum alcohol recommended by the AHA/ASA,” Bushnell advised. 

“We know excessive intake puts you at major risk for CVD, cancer, cognitive decline, and a whole host of other health ailments — no question there,” said Wallace.

The impact of moderate intake, on the other hand, is less clear. “Alcohol is a highly biased and political issue and the evidence (or lack thereof) on both sides is shoddy at best,” Wallace added.

A key challenge is that accurate self-reporting of alcohol intake is difficult, even for scientists, and most studies rely on self-reported data from observational cohorts. These often include limited dietary assessments, which provide only a partial picture of long-term consumption patterns, Wallace noted. 

“The short answer is we don’t know if moderation is beneficial, detrimental, or null with respect to health,” he said.

Bushnell reports no relevant disclosures. Wallace (www.drtaylorwallace.com) is CEO of Think Healthy Group; editor of The Journal of Dietary Supplements, deputy editor of The Journal of the American Nutrition Association (www.nutrition.org), nutrition section editor of Annals of Medicine, and an advisory board member with Forbes Health.

A version of this article appeared on Medscape.com.


Silent Epidemic: Loneliness a Serious Threat to Both Brain and Body

In a world that is more connected than ever, a silent epidemic is taking its toll. Overall, one in three US adults report chronic loneliness — a condition so detrimental that it rivals smoking and obesity with respect to its negative effect on health and well-being. From anxiety and depression to life-threatening conditions like cardiovascular disease, stroke, and Alzheimer’s and Parkinson’s diseases, loneliness is more than an emotion — it’s a serious threat to both the brain and body.

In 2023, a US Surgeon General advisory raised the alarm about the national problem of loneliness and isolation, describing it as an epidemic.

“Given the significant health consequences of loneliness and isolation, we must prioritize building social connection in the same way we have prioritized other critical public health issues such as tobacco, obesity, and substance use disorders. Together, we can build a country that’s healthier, more resilient, less lonely, and more connected,” the report concluded.

But how, exactly, does chronic loneliness affect the physiology and function of the brain? What does the latest research reveal about the link between loneliness and neurologic and psychiatric illness, and what can clinicians do to address the issue?

This news organization spoke to multiple experts in the field to explore these issues.

A Major Risk Factor

Anna Finley, PhD, assistant professor of psychology at North Dakota State University, Fargo, explained that loneliness and social isolation are different entities. Social isolation is an objective measure of the number of people someone interacts with on a regular basis, whereas loneliness is a subjective feeling that occurs when close connections are lacking.

“These two things are not actually as related as you think they would be. People can feel lonely in a crowd or feel well connected with only a few friendships. It’s more about the quality of the connection and the quality of your perception of it. So someone could be in some very supportive relationships but still feel that there’s something missing,” she said in an interview.

So what do we know about how loneliness affects health? Evidence supporting the hypothesis that loneliness is an emerging risk factor for many diseases is steadily building.

Recently, the American Heart Association published a statement summarizing the evidence for a direct association between social isolation and loneliness and coronary heart disease and stroke mortality.

In addition, many studies have shown that individuals experiencing social isolation or loneliness have an increased risk for anxiety and depression, dementia, infectious disease, hospitalization, and all-cause death, even after adjusting for age and many other traditional risk factors.

One study revealed that eliminating loneliness has the potential to prevent nearly 20% of cases of depression in adults aged 50 years or older.

Indu Subramanian, MD, professor of neurology at the University of California, Los Angeles, and colleagues conducted a study involving patients with Parkinson’s disease, which showed that the negative impact of loneliness on disease severity was as significant as the positive effects of 30 minutes of daily exercise.

“The importance of loneliness is under-recognized and undervalued, and it poses a major risk for health outcomes and quality of life,” said Subramanian.

Subramanian noted that loneliness is stigmatizing, causing people to feel unlikable and blame themselves, which prevents them from opening up to doctors or loved ones about their struggle. At the same time, healthcare providers may not think to ask about loneliness or know about potential interventions. She emphasized that much more work is needed to address this issue.

Early Mortality Risk

Julianne Holt-Lunstad, PhD, professor of psychology and neuroscience at Brigham Young University in Provo, Utah, is the author of two large meta-analyses that suggest loneliness, social isolation, or living alone are independent risk factors for early mortality, increasing this risk by about a third — the equivalent to the risk of smoking 15 cigarettes per day.

“We have quite robust evidence across a number of health outcomes implicating the harmful effects of loneliness and social isolation. While these are observational studies and show mainly associations, we do have evidence from longitudinal studies that show lacking social connection, whether that be loneliness or social isolation, predicts subsequent worse outcomes, and most of these studies have adjusted for alternative kinds of explanations, like age, initial health status, lifestyle factors,” Holt-Lunstad said.

There is some evidence to suggest that isolation is more predictive of physical health outcomes, whereas loneliness is more predictive of mental health outcomes. That said, both isolation and loneliness have significant effects on mental and physical health outcomes, she noted.

There is also the question of whether loneliness is causing poor health or whether people who are in poor health feel lonely because poor health can lead to social isolation.

Finley said there’s probably a bit of both going on, but longitudinal studies, where loneliness is measured at a fixed timepoint then health outcomes are reported a few years later, suggest that loneliness is contributing to these adverse outcomes.

She added that there is also some evidence in animal models to suggest that loneliness is a causal risk factor for adverse health outcomes. “But you can’t ask a mouse or rat how lonely they’re feeling. All you can do is house them individually — removing them from social connection. This isn’t necessarily the same thing as loneliness in humans.”

Finley is studying mechanisms in the brain that may be involved in mediating the adverse health consequences of loneliness.

“What I’ve been seeing in the data so far is that it tends to be the self-report of how lonely folks are feeling that has the associations with differences in the brain, as opposed to the number of social connections people have. It does seem to be the more subjective, emotional perception of loneliness that is important.”

In a review of potential mechanisms involved, she concluded that dysregulated emotions and altered perceptions of social interactions have profound impacts on the brain, suggesting that people who are lonely may tend to interpret social cues in a negative way, preventing them from forming productive positive relationships.

Lack of Trust

One researcher who has studied this phenomenon is Dirk Scheele, PhD, professor of social neuroscience at Ruhr University Bochum in Germany.

“We were interested to find out why people remained lonely,” he said in an interview. “Loneliness is an unpleasant experience, and there are so many opportunities for social contacts nowadays, it’s not really clear at first sight why people are chronically lonely.”

To examine this question, Scheele and his team conducted a study in which functional MRI was used to examine the brain in otherwise healthy individuals with high or low loneliness scores while they played a trust game.

They also simulated a positive social interaction between participants and researchers, in which they talked about plans for a fictitious lottery win, and about their hobbies and interests, during which mood was measured with questionnaires, and saliva samples were collected to measure hormone levels.

Results showed that the high-lonely individuals had reduced activation in the insula cortex during the trust decisions. “This area of the brain is involved in the processing of bodily signals, such as ‘gut feelings.’ So reduced activity here could be interpreted as fewer gut feelings on who can be trusted,” Scheele explained.

The high-lonely individuals also had reduced responsiveness to the positive social interaction with a lower release of oxytocin and a smaller elevation in mood compared with the control individuals.

Scheele pointed out that there is some evidence that oxytocin might increase trust, and there is reduced release of endogenous oxytocin in high loneliness.

“Our results are consistent with the idea that loneliness is associated with negative biases about other people. So if we expect negative things from other people — for instance, that they cannot be trusted — then that would hamper further social interactions and could lead to loneliness,” he added.

A Role for Oxytocin?

In another study, the same researchers tested short-term (five weekly sessions) group psychotherapy to reduce loneliness using established techniques to target these negative biases. They also investigated whether the effects of this group psychotherapy could be augmented by administering intranasal oxytocin (vs placebo) before the group psychotherapy sessions.

Results showed that the group psychotherapy intervention reduced trait loneliness (loneliness experienced over a prolonged period). The oxytocin did not show a significant effect on trait loneliness, but there was a suggestion that it may enhance the reduction in state loneliness (how someone is feeling at a specific time) brought about by the psychotherapy sessions.

“We found that bonding within the groups was experienced as more positive in the oxytocin treated groups. It is possible that a longer intervention would be helpful for longer-term results,” Scheele concluded. “It’s not going to be a quick fix for loneliness, but there may be a role for oxytocin as an adjunct to psychotherapy.”
 

A Basic Human Need

Another loneliness researcher, Livia Tomova, PhD, assistant professor of psychology at Cardiff University in Wales, has used social isolation to induce loneliness in young people and found that this intervention was linked to brain patterns similar to those associated with hunger.

“We know that the drive to eat food is a very basic human need. We know quite well how it is represented in the brain,” she explained.

The researchers tested how the brains of the participants responded to seeing pictures of social interactions after they underwent a prolonged period of social isolation. In a subsequent session, the same people were asked to undergo food fasting and then underwent brain scans when looking at pictures of food. Results showed that the neural patterns were similar in the two situations with increased activity in the substantia nigra area within the midbrain.

“This area of the brain processes rewards and motivation. It consists primarily of dopamine neurons and increased activity corresponds to a feeling of craving something. So this area of the brain that controls essential homeostatic needs is activated when people feel lonely, suggesting that our need for social contact with others is potentially a very basic need similar to eating,” Tomova said.
 

Lower Gray Matter Volumes in Key Brain Areas

And another group from Germany has found that higher loneliness scores are negatively associated with specific brain regions responsible for memory, emotion regulation, and social processing.

Sandra Düzel, PhD, and colleagues from the Max Planck Institute for Human Development and the Charité – Universitätsmedizin Berlin, both in Berlin, Germany, reported a study in which individuals who reported higher loneliness had smaller gray matter volumes in brain regions such as the left amygdala, anterior hippocampus, and cerebellum, regions which are crucial for both emotional regulation and higher-order cognitive processes, such as self-reflection and executive function.

Düzel believes that possible mechanisms behind the link between loneliness and brain volume differences could include stress-related damage, with prolonged loneliness associated with elevated levels of stress hormones, which can damage the hippocampus over time, and reduced cognitive and social stimulation, which may contribute to brain volume reductions in regions critical for memory and emotional processing.

“Loneliness is often characterized by reduced social and environmental diversity, leading to less engagement with novel experiences and potentially lower hippocampal-striatal connectivity.

Since novelty-seeking and environmental diversity are associated with positive emotional states, individuals experiencing loneliness might benefit from increased exposure to new environments which could stimulate the brain’s reward circuits, fostering positive affect and potentially mitigating the emotional burden of loneliness,” she said.
 

 

 

Is Social Prescribing the Answer?

So are there enough data now to act and attempt to develop interventions to reduce loneliness? Most of these researchers believe so.

“I think we have enough information to act on this now. There are a number of national academies consensus reports, which suggest that, while certainly there are still gaps in our evidence and more to be learned, there is sufficient evidence that a concerning portion of the population seems to lack connection, and that the consequences are serious enough that we need to do something about it,” said Holt-Lunstad.

Some countries have introduced social prescribing where doctors can prescribe a group activity or a regular visit or telephone conversation with a supportive person.

Subramanian pointed out that it’s easier to implement in countries with national health services and may be more difficult to embrace in the US healthcare system.

“We are not so encouraged from a financial perspective to think about preventive care in the US. We don’t have an easy way to recognize in any tangible way the downstream of such activities in terms of preventing future problems. That is something we need to work on,” she said.

Finley cautioned that to work well, social prescribing will require an understanding of each person’s individual situation.

“Some people may only receive benefit of interacting with others if they are also getting some sort of support to address the social and emotional concerns that are tagging along with loneliness. I’m not sure that just telling people to go join their local gardening club or whatever will be the correct answer for everyone.”

She pointed out that many people will have issues in their life that are making it hard for them to be social. These could be mobility or financial challenges, care responsibilities, or concerns about illnesses or life events. “We need to figure out what would have the most bang for the person’s buck, so to speak, as an intervention. That could mean connecting them to a group relevant to their individual situation.”
 

Opportunity to Connect Not Enough?

Tomova believes that training people in social skills may be a better option. “It appears that some people who are chronically lonely seem to struggle to make relationships with others. So just encouraging them to interact with others more will not necessarily help. We need to better understand the pathways involved and who are the people who become ill. We can then develop and target better interventions and teach people coping strategies for that situation.”

Scheele agreed. “While just giving people the opportunity to connect may work for some, others who are experiencing really chronic loneliness may not benefit very much from this unless their negative belief systems are addressed.” He suggested some sort of psychotherapy may be helpful in this situation.

But at least all seem to agree that healthcare providers need to be more aware of loneliness as a health risk factor, try to identify people at risk, and to think about how best to support them.

Holt-Lunstad noted that one of the recommendations in the US Surgeon General’s advisory was to increase the education, training, and resources on loneliness for healthcare providers.

“If we want this to be addressed, we need to give healthcare providers the time, resources, and training in order to do that, otherwise, we are adding one more thing to an already overburdened system. They need to understand how important it is, and how it might help them take care of the patient.”

“Our hope is that we can start to reverse some of the trends that we are seeing, both in terms of the prevalence rates of loneliness, but also that we could start seeing improvements in health and other kinds of outcomes,” she concluded.

Progress is being made in increasing awareness about the dangers of chronic loneliness. It’s now recognized as a serious health risk, but there are actionable steps that can help. Loneliness doesn’t have to be a permanent condition for anyone, said Scheele.

Holt-Lunstad served as an adviser for Foundation for Social Connection, Global Initiative on Loneliness and Connection, and Nextdoor Neighborhood Vitality Board and received research grants/income from Templeton Foundation, Eventbrite, Foundation for Social Connection, and Triple-S Foundation. Subramanian served as a speaker bureau for Acorda Pharma. The other researchers reported no disclosures.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

In a world that is more connected than ever, a silent epidemic is taking its toll. Overall, one in three US adults report chronic loneliness — a condition so detrimental that it rivals smoking and obesity with respect to its negative effect on health and well-being. From anxiety and depression to life-threatening conditions like cardiovascular disease, stroke, and Alzheimer’s and Parkinson’s diseases, loneliness is more than an emotion — it’s a serious threat to both the brain and body.

In 2023, a US Surgeon General advisory raised the alarm about the national problem of loneliness and isolation, describing it as an epidemic.

“Given the significant health consequences of loneliness and isolation, we must prioritize building social connection in the same way we have prioritized other critical public health issues such as tobacco, obesity, and substance use disorders. Together, we can build a country that’s healthier, more resilient, less lonely, and more connected,” the report concluded.

But how, exactly, does chronic loneliness affect the physiology and function of the brain? What does the latest research reveal about the link between loneliness and neurologic and psychiatric illness, and what can clinicians do to address the issue?

This news organization spoke to multiple experts in the field to explore these issues.
 

A Major Risk Factor

Anna Finley, PhD, assistant professor of psychology at North Dakota State University, Fargo, explained that loneliness and social isolation are different entities. Social isolation is an objective measure of the number of people someone interacts with on a regular basis, whereas loneliness is a subjective feeling that occurs when close connections are lacking.

“These two things are not actually as related as you think they would be. People can feel lonely in a crowd or feel well connected with only a few friendships. It’s more about the quality of the connection and the quality of your perception of it. So someone could be in some very supportive relationships but still feel that there’s something missing,” she said in an interview.

So what do we know about how loneliness affects health? Evidence supporting the hypothesis that loneliness is an emerging risk factor for many diseases is steadily building.

Recently, the American Heart Association published a statement summarizing the evidence for a direct association between social isolation or loneliness and mortality from coronary heart disease and stroke.

In addition, many studies have shown that individuals experiencing social isolation or loneliness have an increased risk for anxiety and depression, dementia, infectious disease, hospitalization, and all-cause death, even after adjusting for age and many other traditional risk factors.

One study revealed that eliminating loneliness has the potential to prevent nearly 20% of cases of depression in adults aged 50 years or older.

Indu Subramanian, MD, professor of neurology at the University of California, Los Angeles, and colleagues conducted a study involving patients with Parkinson’s disease, which showed that the negative impact of loneliness on disease severity was as significant as the positive effects of 30 minutes of daily exercise.

“The importance of loneliness is under-recognized and undervalued, and it poses a major risk for health outcomes and quality of life,” said Subramanian.

Subramanian noted that loneliness is stigmatizing, causing people to feel unlikable and blame themselves, which prevents them from opening up to doctors or loved ones about their struggle. At the same time, healthcare providers may not think to ask about loneliness or know about potential interventions. She emphasized that much more work is needed to address this issue.

Early Mortality Risk

Julianne Holt-Lunstad, PhD, professor of psychology and neuroscience at Brigham Young University in Provo, Utah, is the author of two large meta-analyses suggesting that loneliness, social isolation, and living alone are independent risk factors for early mortality, increasing this risk by about a third, the equivalent of smoking 15 cigarettes per day.

“We have quite robust evidence across a number of health outcomes implicating the harmful effects of loneliness and social isolation. While these are observational studies and show mainly associations, we do have evidence from longitudinal studies that show lacking social connection, whether that be loneliness or social isolation, predicts subsequent worse outcomes, and most of these studies have adjusted for alternative kinds of explanations, like age, initial health status, lifestyle factors,” Holt-Lunstad said.

There is some evidence to suggest that isolation is more predictive of physical health outcomes, whereas loneliness is more predictive of mental health outcomes. That said, both isolation and loneliness have significant effects on mental and physical health outcomes, she noted.

There is also the question of reverse causation: Does loneliness cause poor health, or do people in poor health feel lonely because illness limits their social contact?

Finley said there’s probably a bit of both going on, but longitudinal studies, in which loneliness is measured at a fixed timepoint and health outcomes are assessed a few years later, suggest that loneliness is contributing to these adverse outcomes.

She added that there is also some evidence in animal models to suggest that loneliness is a causal risk factor for adverse health outcomes. “But you can’t ask a mouse or rat how lonely they’re feeling. All you can do is house them individually — removing them from social connection. This isn’t necessarily the same thing as loneliness in humans.”

Finley is studying mechanisms in the brain that may be involved in mediating the adverse health consequences of loneliness.

“What I’ve been seeing in the data so far is that it tends to be the self-report of how lonely folks are feeling that has the associations with differences in the brain, as opposed to the number of social connections people have. It does seem to be the more subjective, emotional perception of loneliness that is important.”

In a review of potential mechanisms, she concluded that dysregulated emotions and altered perceptions of social interactions have profound impacts on the brain, suggesting that people who are lonely may tend to interpret social cues negatively, which prevents them from forming productive, positive relationships.
 

Lack of Trust

One researcher who has studied this phenomenon is Dirk Scheele, PhD, professor of social neuroscience at Ruhr University Bochum in Germany.

“We were interested to find out why people remained lonely,” he said in an interview. “Loneliness is an unpleasant experience, and there are so many opportunities for social contacts nowadays, it’s not really clear at first sight why people are chronically lonely.”

To examine this question, Scheele and his team conducted a study in which functional MRI was used to examine the brain in otherwise healthy individuals with high or low loneliness scores while they played a trust game.

They also simulated a positive social interaction between participants and researchers, in which they talked about plans for a fictitious lottery win and about their hobbies and interests. During this interaction, mood was measured with questionnaires, and saliva samples were collected to measure hormone levels.

Results showed that the high-lonely individuals had reduced activation in the insular cortex during the trust decisions. “This area of the brain is involved in the processing of bodily signals, such as ‘gut feelings.’ So reduced activity here could be interpreted as fewer gut feelings on who can be trusted,” Scheele explained.

The high-lonely individuals also had reduced responsiveness to the positive social interaction with a lower release of oxytocin and a smaller elevation in mood compared with the control individuals.

Scheele pointed out that there is some evidence that oxytocin might increase trust, and there is reduced release of endogenous oxytocin in high loneliness.

“Our results are consistent with the idea that loneliness is associated with negative biases about other people. So if we expect negative things from other people — for instance, that they cannot be trusted — then that would hamper further social interactions and could lead to loneliness,” he added.

A Role for Oxytocin?

In another study, the same researchers tested short-term (five weekly sessions) group psychotherapy to reduce loneliness using established techniques to target these negative biases. They also investigated whether the effects of this group psychotherapy could be augmented by administering intranasal oxytocin (vs placebo) before the group psychotherapy sessions.

Results showed that the group psychotherapy intervention reduced trait loneliness (loneliness experienced over a prolonged period). The oxytocin did not show a significant effect on trait loneliness, but there was a suggestion that it may enhance the reduction in state loneliness (how someone is feeling at a specific time) brought about by the psychotherapy sessions.

“We found that bonding within the groups was experienced as more positive in the oxytocin-treated groups. It is possible that a longer intervention would be helpful for longer-term results,” Scheele concluded. “It’s not going to be a quick fix for loneliness, but there may be a role for oxytocin as an adjunct to psychotherapy.”
 

A Basic Human Need

Another loneliness researcher, Livia Tomova, PhD, assistant professor of psychology at Cardiff University in Wales, has used social isolation to induce loneliness in young people and found that this intervention was linked to brain patterns similar to those associated with hunger.

“We know that the drive to eat food is a very basic human need. We know quite well how it is represented in the brain,” she explained.

The researchers tested how the brains of the participants responded to seeing pictures of social interactions after they underwent a prolonged period of social isolation. In a subsequent session, the same people were asked to undergo food fasting and then underwent brain scans while looking at pictures of food. Results showed that the neural patterns were similar in the two situations, with increased activity in the substantia nigra, within the midbrain.

“This area of the brain processes rewards and motivation. It consists primarily of dopamine neurons and increased activity corresponds to a feeling of craving something. So this area of the brain that controls essential homeostatic needs is activated when people feel lonely, suggesting that our need for social contact with others is potentially a very basic need similar to eating,” Tomova said.
 

Lower Gray Matter Volumes in Key Brain Areas

Another group, from Germany, has found that higher loneliness scores are associated with lower gray matter volume in specific brain regions responsible for memory, emotion regulation, and social processing.

Sandra Düzel, PhD, and colleagues from the Max Planck Institute for Human Development and the Charité – Universitätsmedizin Berlin, both in Berlin, Germany, reported a study in which individuals who reported higher loneliness had smaller gray matter volumes in brain regions such as the left amygdala, anterior hippocampus, and cerebellum. These regions are crucial for emotional regulation and for higher-order cognitive processes such as self-reflection and executive function.

Düzel believes that possible mechanisms behind the link between loneliness and brain volume differences could include stress-related damage: prolonged loneliness is associated with elevated levels of stress hormones, which can damage the hippocampus over time. Reduced cognitive and social stimulation may also contribute to volume reductions in regions critical for memory and emotional processing.

“Loneliness is often characterized by reduced social and environmental diversity, leading to less engagement with novel experiences and potentially lower hippocampal-striatal connectivity.

“Since novelty-seeking and environmental diversity are associated with positive emotional states, individuals experiencing loneliness might benefit from increased exposure to new environments, which could stimulate the brain’s reward circuits, fostering positive affect and potentially mitigating the emotional burden of loneliness,” she said.

Is Social Prescribing the Answer?

So are there enough data now to act and attempt to develop interventions to reduce loneliness? Most of these researchers believe so.

“I think we have enough information to act on this now. There are a number of national academies consensus reports, which suggest that, while certainly there are still gaps in our evidence and more to be learned, there is sufficient evidence that a concerning portion of the population seems to lack connection, and that the consequences are serious enough that we need to do something about it,” said Holt-Lunstad.

Some countries have introduced social prescribing, in which doctors can prescribe a group activity, a regular visit, or a telephone conversation with a supportive person.

Subramanian pointed out that social prescribing is easier to implement in countries with national health services and may be harder to embrace in the US healthcare system.

“We are not so encouraged from a financial perspective to think about preventive care in the US. We don’t have an easy way to recognize in any tangible way the downstream effects of such activities in terms of preventing future problems. That is something we need to work on,” she said.

Finley cautioned that to work well, social prescribing will require an understanding of each person’s individual situation.

“Some people may only receive benefit from interacting with others if they are also getting some sort of support to address the social and emotional concerns that are tagging along with loneliness. I’m not sure that just telling people to go join their local gardening club or whatever will be the correct answer for everyone.”

She pointed out that many people will have issues in their life that are making it hard for them to be social. These could be mobility or financial challenges, care responsibilities, or concerns about illnesses or life events. “We need to figure out what would have the most bang for the person’s buck, so to speak, as an intervention. That could mean connecting them to a group relevant to their individual situation.”
 

Opportunity to Connect Not Enough?

Tomova believes that training people in social skills may be a better option. “It appears that some people who are chronically lonely seem to struggle to make relationships with others. So just encouraging them to interact with others more will not necessarily help. We need to better understand the pathways involved and who are the people who become ill. We can then develop and target better interventions and teach people coping strategies for that situation.”

Scheele agreed. “While just giving people the opportunity to connect may work for some, others who are experiencing really chronic loneliness may not benefit very much from this unless their negative belief systems are addressed.” He suggested some sort of psychotherapy may be helpful in this situation.

But all seem to agree that healthcare providers need to be more aware of loneliness as a health risk factor, try to identify people at risk, and think about how best to support them.

Holt-Lunstad noted that one of the recommendations in the US Surgeon General’s advisory was to increase the education, training, and resources on loneliness for healthcare providers.

“If we want this to be addressed, we need to give healthcare providers the time, resources, and training in order to do that, otherwise, we are adding one more thing to an already overburdened system. They need to understand how important it is, and how it might help them take care of the patient.”

“Our hope is that we can start to reverse some of the trends that we are seeing, both in terms of the prevalence rates of loneliness, but also that we could start seeing improvements in health and other kinds of outcomes,” she concluded.

Progress is being made in raising awareness about the dangers of chronic loneliness. It is now recognized as a serious health risk, and there are actionable steps that can help. Loneliness doesn’t have to be a permanent condition for anyone, said Scheele.

Holt-Lunstad served as an adviser for the Foundation for Social Connection, the Global Initiative on Loneliness and Connection, and the Nextdoor Neighborhood Vitality Board and received research grants/income from the Templeton Foundation, Eventbrite, the Foundation for Social Connection, and the Triple-S Foundation. Subramanian served on a speakers bureau for Acorda Pharma. The other researchers reported no disclosures.

A version of this article first appeared on Medscape.com.

In a world that is more connected than ever, a silent epidemic is taking its toll. Overall, one in three US adults report chronic loneliness — a condition so detrimental that it rivals smoking and obesity with respect to its negative effect on health and well-being. From anxiety and depression to life-threatening conditions like cardiovascular disease, stroke, and Alzheimer’s and Parkinson’s diseases, loneliness is more than an emotion — it’s a serious threat to both the brain and body.

In 2023, a US Surgeon General advisory raised the alarm about the national problem of loneliness and isolation, describing it as an epidemic.

“Given the significant health consequences of loneliness and isolation, we must prioritize building social connection in the same way we have prioritized other critical public health issues such as tobacco, obesity, and substance use disorders. Together, we can build a country that’s healthier, more resilient, less lonely, and more connected,” the report concluded.

But how, exactly, does chronic loneliness affect the physiology and function of the brain? What does the latest research reveal about the link between loneliness and neurologic and psychiatric illness, and what can clinicians do to address the issue?

This news organization spoke to multiple experts in the field to explore these issues.
 

A Major Risk Factor

Anna Finley, PhD, assistant professor of psychology at North Dakota State University, Fargo, explained that loneliness and social isolation are different entities. Social isolation is an objective measure of the number of people someone interacts with on a regular basis, whereas loneliness is a subjective feeling that occurs when close connections are lacking.

“These two things are not actually as related as you think they would be. People can feel lonely in a crowd or feel well connected with only a few friendships. It’s more about the quality of the connection and the quality of your perception of it. So someone could be in some very supportive relationships but still feel that there’s something missing,” she said in an interview.

So what do we know about how loneliness affects health? Evidence supporting the hypothesis that loneliness is an emerging risk factor for many diseases is steadily building.

Recently, the American Heart Association published a statement summarizing the evidence for a direct association between social isolation and loneliness and coronary heart disease and stroke mortality.

In addition, many studies have shown that individuals experiencing social isolation or loneliness have an increased risk for anxiety and depression, dementia, infectious disease, hospitalization, and all-cause death, even after adjusting for age and many other traditional risk factors.

One study revealed that eliminating loneliness has the potential to prevent nearly 20% of cases of depression in adults aged 50 years or older.

Indu Subramanian, MD, professor of neurology at the University of California, Los Angeles, and colleagues conducted a study involving patients with Parkinson’s disease, which showed that the negative impact of loneliness on disease severity was as significant as the positive effects of 30 minutes of daily exercise.

“The importance of loneliness is under-recognized and undervalued, and it poses a major risk for health outcomes and quality of life,” said Subramanian.

Subramanian noted that loneliness is stigmatizing, causing people to feel unlikable and blame themselves, which prevents them from opening up to doctors or loved ones about their struggle. At the same time, healthcare providers may not think to ask about loneliness or know about potential interventions. She emphasized that much more work is needed to address this issue.
 

 

 

Early Mortality Risk

Julianne Holt-Lunstad, PhD, professor of psychology and neuroscience at Brigham Young University in Provo, Utah, is the author of two large meta-analyses that suggest loneliness, social isolation, or living alone are independent risk factors for early mortality, increasing this risk by about a third — the equivalent to the risk of smoking 15 cigarettes per day.

“We have quite robust evidence across a number of health outcomes implicating the harmful effects of loneliness and social isolation. While these are observational studies and show mainly associations, we do have evidence from longitudinal studies that show lacking social connection, whether that be loneliness or social isolation, predicts subsequent worse outcomes, and most of these studies have adjusted for alternative kinds of explanations, like age, initial health status, lifestyle factors,” Holt-Lunstad said.

There is some evidence to suggest that isolation is more predictive of physical health outcomes, whereas loneliness is more predictive of mental health outcomes. That said, both isolation and loneliness have significant effects on mental and physical health outcomes, she noted.

There is also the question of whether loneliness is causing poor health or whether people who are in poor health feel lonely because poor health can lead to social isolation.

Finley said there’s probably a bit of both going on, but longitudinal studies, where loneliness is measured at a fixed timepoint then health outcomes are reported a few years later, suggest that loneliness is contributing to these adverse outcomes.

She added that there is also some evidence in animal models to suggest that loneliness is a causal risk factor for adverse health outcomes. “But you can’t ask a mouse or rat how lonely they’re feeling. All you can do is house them individually — removing them from social connection. This isn’t necessarily the same thing as loneliness in humans.”

Finley is studying mechanisms in the brain that may be involved in mediating the adverse health consequences of loneliness.

“What I’ve been seeing in the data so far is that it tends to be the self-report of how lonely folks are feeling that has the associations with differences in the brain, as opposed to the number of social connections people have. It does seem to be the more subjective, emotional perception of loneliness that is important.”

In a review of potential mechanisms involved, she concluded that it is dysregulated emotions and altered perceptions of social interactions that has profound impacts on the brain, suggesting that people who are lonely may have a tendency to interpret social cues in a negative way, preventing them from forming productive positive relationships.
 

Lack of Trust

One researcher who has studied this phenomenon is Dirk Scheele, PhD, professor of social neuroscience at Ruhr University Bochum in Germany.

“We were interested to find out why people remained lonely,” he said in an interview. “Loneliness is an unpleasant experience, and there are so many opportunities for social contacts nowadays, it’s not really clear at first sight why people are chronically lonely.”

To examine this question, Scheele and his team conducted a study in which functional MRI was used to examine the brain in otherwise healthy individuals with high or low loneliness scores while they played a trust game.

They also simulated a positive social interaction between participants and researchers, in which they talked about plans for a fictitious lottery win, and about their hobbies and interests, during which mood was measured with questionnaires, and saliva samples were collected to measure hormone levels.

Results showed that the high-lonely individuals had reduced activation in the insula cortex during the trust decisions. “This area of the brain is involved in the processing of bodily signals, such as ‘gut feelings.’ So reduced activity here could be interpreted as fewer gut feelings on who can be trusted,” Scheele explained.

The high-lonely individuals also had reduced responsiveness to the positive social interaction with a lower release of oxytocin and a smaller elevation in mood compared with the control individuals.

Scheele pointed out that there is some evidence that oxytocin might increase trust and that endogenous oxytocin release is reduced in people with high loneliness.

“Our results are consistent with the idea that loneliness is associated with negative biases about other people. So if we expect negative things from other people — for instance, that they cannot be trusted — then that would hamper further social interactions and could lead to loneliness,” he added.

A Role for Oxytocin?

In another study, the same researchers tested short-term (five weekly sessions) group psychotherapy to reduce loneliness using established techniques to target these negative biases. They also investigated whether the effects of this group psychotherapy could be augmented by administering intranasal oxytocin (vs placebo) before the group psychotherapy sessions.

Results showed that the group psychotherapy intervention reduced trait loneliness (loneliness experienced over a prolonged period). The oxytocin did not show a significant effect on trait loneliness, but there was a suggestion that it may enhance the reduction in state loneliness (how someone is feeling at a specific time) brought about by the psychotherapy sessions.

“We found that bonding within the groups was experienced as more positive in the oxytocin-treated groups. It is possible that a longer intervention would be helpful for longer-term results,” Scheele concluded. “It’s not going to be a quick fix for loneliness, but there may be a role for oxytocin as an adjunct to psychotherapy.”
 

A Basic Human Need

Another loneliness researcher, Livia Tomova, PhD, assistant professor of psychology at Cardiff University in Wales, has used social isolation to induce loneliness in young people and found that this intervention was linked to brain patterns similar to those associated with hunger.

“We know that the drive to eat food is a very basic human need. We know quite well how it is represented in the brain,” she explained.

The researchers tested how the brains of the participants responded to seeing pictures of social interactions after they underwent a prolonged period of social isolation. In a subsequent session, the same people fasted from food and then underwent brain scans while looking at pictures of food. Results showed that the neural patterns were similar in the two situations, with increased activity in the substantia nigra, an area within the midbrain.

“This area of the brain processes rewards and motivation. It consists primarily of dopamine neurons, and increased activity corresponds to a feeling of craving something. So this area of the brain that controls essential homeostatic needs is activated when people feel lonely, suggesting that our need for social contact with others is potentially a very basic need, similar to eating,” Tomova said.
 

Lower Gray Matter Volumes in Key Brain Areas

Another group, in Germany, has found that higher loneliness scores are associated with lower gray matter volumes in specific brain regions responsible for memory, emotion regulation, and social processing.

Sandra Düzel, PhD, and colleagues from the Max Planck Institute for Human Development and the Charité – Universitätsmedizin Berlin, both in Berlin, Germany, reported a study in which individuals who reported higher loneliness had smaller gray matter volumes in regions such as the left amygdala, anterior hippocampus, and cerebellum, regions that are crucial for both emotional regulation and higher-order cognitive processes such as self-reflection and executive function.

Düzel believes that possible mechanisms behind the link between loneliness and brain volume differences could include stress-related damage, as prolonged loneliness is associated with elevated levels of stress hormones that can damage the hippocampus over time, and reduced cognitive and social stimulation, which may contribute to brain volume reductions in regions critical for memory and emotional processing.

“Loneliness is often characterized by reduced social and environmental diversity, leading to less engagement with novel experiences and potentially lower hippocampal-striatal connectivity. Since novelty-seeking and environmental diversity are associated with positive emotional states, individuals experiencing loneliness might benefit from increased exposure to new environments, which could stimulate the brain’s reward circuits, fostering positive affect and potentially mitigating the emotional burden of loneliness,” she said.

Is Social Prescribing the Answer?

So are there enough data now to act and attempt to develop interventions to reduce loneliness? Most of these researchers believe so.

“I think we have enough information to act on this now. There are a number of national academies consensus reports, which suggest that, while certainly there are still gaps in our evidence and more to be learned, there is sufficient evidence that a concerning portion of the population seems to lack connection, and that the consequences are serious enough that we need to do something about it,” said Holt-Lunstad.

Some countries have introduced social prescribing where doctors can prescribe a group activity or a regular visit or telephone conversation with a supportive person.

Subramanian pointed out that it’s easier to implement in countries with national health services and may be more difficult to embrace in the US healthcare system.

“We are not so encouraged from a financial perspective to think about preventive care in the US. We don’t have an easy way to recognize in any tangible way the downstream of such activities in terms of preventing future problems. That is something we need to work on,” she said.

Finley cautioned that to work well, social prescribing will require an understanding of each person’s individual situation.

“Some people may only receive benefit of interacting with others if they are also getting some sort of support to address the social and emotional concerns that are tagging along with loneliness. I’m not sure that just telling people to go join their local gardening club or whatever will be the correct answer for everyone.”

She pointed out that many people will have issues in their life that are making it hard for them to be social. These could be mobility or financial challenges, care responsibilities, or concerns about illnesses or life events. “We need to figure out what would have the most bang for the person’s buck, so to speak, as an intervention. That could mean connecting them to a group relevant to their individual situation.”
 

Opportunity to Connect Not Enough?

Tomova believes that training people in social skills may be a better option. “It appears that some people who are chronically lonely seem to struggle to make relationships with others. So just encouraging them to interact with others more will not necessarily help. We need to better understand the pathways involved and who are the people who become ill. We can then develop and target better interventions and teach people coping strategies for that situation.”

Scheele agreed. “While just giving people the opportunity to connect may work for some, others who are experiencing really chronic loneliness may not benefit very much from this unless their negative belief systems are addressed.” He suggested some sort of psychotherapy may be helpful in this situation.

But all seem to agree that healthcare providers need to be more aware of loneliness as a health risk factor, try to identify people at risk, and think about how best to support them.

Holt-Lunstad noted that one of the recommendations in the US Surgeon General’s advisory was to increase the education, training, and resources on loneliness for healthcare providers.

“If we want this to be addressed, we need to give healthcare providers the time, resources, and training in order to do that, otherwise, we are adding one more thing to an already overburdened system. They need to understand how important it is, and how it might help them take care of the patient.”

“Our hope is that we can start to reverse some of the trends that we are seeing, both in terms of the prevalence rates of loneliness, but also that we could start seeing improvements in health and other kinds of outcomes,” she concluded.

Progress is being made in increasing awareness about the dangers of chronic loneliness. It’s now recognized as a serious health risk, but there are actionable steps that can help. Loneliness doesn’t have to be a permanent condition for anyone, said Scheele.

Holt-Lunstad served as an adviser for Foundation for Social Connection, Global Initiative on Loneliness and Connection, and Nextdoor Neighborhood Vitality Board and received research grants/income from Templeton Foundation, Eventbrite, Foundation for Social Connection, and Triple-S Foundation. Subramanian served as a speaker bureau for Acorda Pharma. The other researchers reported no disclosures.

A version of this article first appeared on Medscape.com.


Being a Weekend Warrior Linked to Lower Dementia Risk

Article Type
Changed

 

TOPLINE:

Weekend exercise, involving one or two sessions per week, is associated with a similar reduction in risk for mild dementia as that reported with more frequent exercise, a new study shows. Investigators say the findings suggest even limited physical activity may offer protective cognitive benefits.

METHODOLOGY:

  • Researchers analyzed the data of 10,033 participants in the Mexico City Prospective Study who were aged 35 years or older.
  • Physical activity patterns were categorized into four groups: No exercise, weekend warriors (one or two sessions per week), regularly active (three or more sessions per week), and a combined group.
  • Cognitive function was assessed using the Mini-Mental State Examination (MMSE).
  • The analysis adjusted for confounders such as age, sex, education, income, blood pressure, smoking status, body mass index, civil status, sleep duration, diet, and alcohol intake.
  • The mean follow-up duration was 16 years.

TAKEAWAY:

  • When mild dementia was defined as an MMSE score ≤ 22, dementia prevalence was 26% in those who did not exercise, 14% in weekend warriors, and 18.5% in the regularly active group.
  • When mild dementia was defined as an MMSE score ≤ 23, dementia prevalence was 30% in those who did not exercise, 20% in weekend warriors, and 22% in the regularly active group.
  • Compared with people who did not exercise and after adjusting for confounding factors, risk for mild dementia was 13%-25% lower in weekend warriors, 11%-12% lower in the regular activity group, and 12%-16% lower in the two groups combined.
  • The findings were consistent in men and women.

IN PRACTICE:

“To the best of our knowledge, this is the first prospective cohort study to show that the weekend warrior physical activity pattern and the regularly active physical activity pattern are associated with similar reductions in the risk of mild dementia. This study has important implications for policy and practice because the weekend warrior physical activity pattern may be a more convenient option for busy people around the world,” the authors wrote.

SOURCE:

The study was led by Gary O’Donovan, Faculty of Medicine, University of the Andes, Bogotá, Colombia. It was published online in the British Journal of Sports Medicine.

LIMITATIONS:

The survey respondents may not have been truly representative of middle-aged adults. Further, there were no objective measures of physical activity. The observational nature of the study does not provide insights into causality.

DISCLOSURES:

The study was funded by the Mexican Health Ministry, the National Council of Science and Technology for Mexico, Wellcome, and the UK Medical Research Council. No conflicts of interest were disclosed.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.
 

A version of this article appeared on Medscape.com.


Cannabis Use Linked to Brain Thinning in Adolescents

Article Type
Changed

Cannabis use may lead to thinning of the cerebral cortex in adolescents, research in mice and humans suggested.

The multilevel study demonstrated that tetrahydrocannabinol (THC), an active substance in cannabis, causes shrinkage of dendritic arborization — the neurons’ network of antennae that play a critical role in communication between brain cells.

The connection between dendritic arborization and cortical thickness was hinted at in an earlier study by Tomáš Paus, MD, PhD, professor of psychiatry and addictology at the University of Montreal, Quebec, Canada, and colleagues, who found that cannabis use in early adolescence was associated with lower cortical thickness in boys with a high genetic risk for schizophrenia.

“We speculated at that time that the differences in cortical thickness might be related to differences in dendritic arborization, and our current study confirmed it,” Paus said.

That confirmation came in the mouse part of the study, when coauthor Graciela Piñeyro, MD, PhD, also of the University of Montreal, counted the dendritic branches of mice exposed to THC and compared the total with the number of dendritic branches in unexposed mice. “What surprised me was finding that THC in the mice was targeting the same type of cells and structures that Dr. Paus had predicted would be affected from the human studies,” she said. “Structurally, they were mostly the neurons that contribute to synapses in the cortex, and their branching was reduced.”

Paus explained that in humans, a decrease in input from the affected dendrites “makes it harder for the brain to learn new things, interact with people, cope with new situations, et cetera. In other words, it makes the brain more vulnerable to everything that can happen in a young person’s life.”

The study was published online on October 9 in the Journal of Neuroscience.
 

Of Mice, Men, and Cannabis

Although associations between cannabis use by teenagers and variations in brain maturation have been well studied, the cellular and molecular underpinnings of these associations were unclear, according to the authors.

To investigate further, they conducted this three-step study. First, they exposed adolescent male mice to THC or a synthetic cannabinoid (WIN 55,212-2) and assessed differentially expressed genes, spine numbers, and the extent of dendritic complexity in the frontal cortex of each mouse.

Next, using MRI, they examined differences in cortical thickness in 34 brain regions in 140 male adolescents who experimented with cannabis before age 16 years and 327 who did not.

Then, they again conducted experiments in mice and found that 13 THC-related genes correlated with variations in cortical thickness. Virtual histology revealed that these 13 genes were coexpressed with cell markers of astrocytes, microglia, and a type of pyramidal cell enriched in genes that regulate dendritic expression.

Similarly, the WIN-related genes correlated with differences in cortical thickness and showed coexpression patterns with the same three cell types.

Furthermore, the affected genes were also found in humans, particularly in the thinner cortical regions of the adolescents who experimented with cannabis.

By acting on microglia, THC seems to promote the removal of synapses and, eventually, the reduction of the dendritic tree in mice, Piñeyro explained. That’s important not only because a similar mechanism may be at work in humans but also because “we now might have a model to test different types of cannabis products to see which ones are producing the greatest effect on neurons and therefore greater removal of synapses through the microglia. This could be a way of testing drugs that are out in the street to see which would be the most or least dangerous to the synapses in the brain.”

‘Significant Implications’

Commenting on the study, Yasmin Hurd, PhD, Ward-Coleman chair of translational neuroscience at the Icahn School of Medicine at Mount Sinai and director of the Addiction Institute of Mount Sinai in New York City, said, “These findings are in line with previous results, so they are feasible. This study adds more depth by showing that cortical genes that were differentially altered by adolescent THC correlated with cannabis-related changes in cortical thickness based on human neuroimaging data.” Hurd did not participate in the research.

“The results emphasize that consumption of potent cannabis products during adolescence can impact cortical function, which has significant implications for decision-making and risky behavior as well. It also can increase vulnerability to psychiatric disorders such as schizophrenia.”

Although a mouse model is “not truly the same as the human condition, the fact that the animal model also showed evidence of the morphological changes indicative of reduced cortical thickness, [like] the humans, is strong,” she said.

Additional research could include women and assess potential sex differences, she added.

Ronald Ellis, MD, PhD, an investigator in the Center for Medicinal Cannabis Research at the University of California, San Diego School of Medicine, said, “The findings are plausible and extend prior work showing evidence of increased risk for psychotic disorders later in life in adolescents who use cannabis.” Ellis did not participate in the research.

“Future studies should explore how these findings might vary across different demographic groups, which could provide a more inclusive understanding of how cannabis impacts the brain,” he said. “Additionally, longitudinal studies to track changes in the brain over time could help to establish causal relationships more robustly.

“The take-home message to clinicians at this point is to discuss cannabis use history carefully and confidentially with adolescent patients to better provide advice on its potential risks,” he concluded.

Paus added that he would tell patients, “If you’re going to use cannabis, don’t start early. If you have to, then do so in moderation. And if you have family history of mental illness, be very careful.”

No funding for the study was reported. Paus, Piñeyro, Hurd, and Ellis declared having no relevant financial relationships. 
 

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

Cannabis use may lead to thinning of the cerebral cortex in adolescents, research in mice and humans suggested.

The multilevel study demonstrated that tetrahydrocannabinol (THC), an active substance in cannabis, causes shrinkage of dendritic arborization — the neurons’ network of antennae that play a critical role in communication between brain cells.

The connection between dendritic arborization and cortical thickness was hinted at in an earlier study by Tomáš Paus, MD, PhD, professor of psychiatry and addictology at the University of Montreal, Quebec, Canada, and colleagues, who found that cannabis use in early adolescence was associated with lower cortical thickness in boys with a high genetic risk for schizophrenia.

“We speculated at that time that the differences in cortical thickness might be related to differences in dendritic arborization, and our current study confirmed it,” Paus said.

That confirmation came in the mouse part of the study, when coauthor Graciela Piñeyro, MD, PhD, also of the University of Montreal, counted the dendritic branches of mice exposed to THC and compared the total with the number of dendritic branches in unexposed mice. “What surprised me was finding that THC in the mice was targeting the same type of cells and structures that Dr. Paus had predicted would be affected from the human studies,” she said. “Structurally, they were mostly the neurons that contribute to synapses in the cortex, and their branching was reduced.”

Paus explained that in humans, a decrease in input from the affected dendrites “makes it harder for the brain to learn new things, interact with people, cope with new situations, et cetera. In other words, it makes the brain more vulnerable to everything that can happen in a young person’s life.”

The study was published online on October 9 in the Journal of Neuroscience.
 

Of Mice, Men, and Cannabis

Although associations between cannabis use by teenagers and variations in brain maturation have been well studied, the cellular and molecular underpinnings of these associations were unclear, according to the authors.

To investigate further, they conducted this three-step study. First, they exposed adolescent male mice to THC or a synthetic cannabinoid (WIN 55,212-2) and assessed differentially expressed genes, spine numbers, and the extent of dendritic complexity in the frontal cortex of each mouse.

Next, using MRI, they examined differences in cortical thickness in 34 brain regions in 140 male adolescents who experimented with cannabis before age 16 years and 327 who did not.

Then, they again conducted experiments in mice and found that 13 THC-related genes correlated with variations in cortical thickness. Virtual histology revealed that these 13 genes were coexpressed with cell markers of astrocytes, microglia, and a type of pyramidal cell enriched in genes that regulate dendritic expression.

Similarly, the WIN-related genes correlated with differences in cortical thickness and showed coexpression patterns with the same three cell types.

Furthermore, the affected genes were also found in humans, particularly in the thinner cortical regions of the adolescents who experimented with cannabis.

By acting on microglia, THC seems to promote the removal of synapses and, eventually, the reduction of the dendritic tree in mice, Piñeyro explained. That’s important not only because a similar mechanism may be at work in humans but also because “we now might have a model to test different types of cannabis products to see which ones are producing the greatest effect on neurons and therefore greater removal of synapses through the microglia. This could be a way of testing drugs that are out in the street to see which would be the most or least dangerous to the synapses in the brain.”
 

 

 

‘Significant Implications’

Commenting on the study, Yasmin Hurd, PhD, Ward-Coleman chair of translational neuroscience at the Icahn School of Medicine at Mount Sinai and director of the Addiction Institute of Mount Sinai in New York City, said, “These findings are in line with previous results, so they are feasible. This study adds more depth by showing that cortical genes that were differentially altered by adolescent THC correlated with cannabis-related changes in cortical thickness based on human neuroimaging data.” Hurd did not participate in the research.

“The results emphasize that consumption of potent cannabis products during adolescence can impact cortical function, which has significant implications for decision-making and risky behavior as well. It also can increase vulnerability to psychiatric disorders such as schizophrenia.”

Although a mouse model is “not truly the same as the human condition, the fact that the animal model also showed evidence of the morphological changes indicative of reduced cortical thickness, [like] the humans, is strong,” she said.

Additional research could include women and assess potential sex differences, she added.

Ronald Ellis, MD, PhD, an investigator in the Center for Medicinal Cannabis Research at the University of California, San Diego School of Medicine, said, “The findings are plausible and extend prior work showing evidence of increased risk for psychotic disorders later in life in adolescents who use cannabis.” Ellis did not participate in the research.

“Future studies should explore how these findings might vary across different demographic groups, which could provide a more inclusive understanding of how cannabis impacts the brain,” he said. “Additionally, longitudinal studies to track changes in the brain over time could help to establish causal relationships more robustly.

“The take-home message to clinicians at this point is to discuss cannabis use history carefully and confidentially with adolescent patients to better provide advice on its potential risks,” he concluded.

Paus added that he would tell patients, “If you’re going to use cannabis, don’t start early. If you have to, then do so in moderation. And if you have family history of mental illness, be very careful.”

No funding for the study was reported. Paus, Piñeyro, Hurd, and Ellis declared having no relevant financial relationships. 
 

A version of this article appeared on Medscape.com.

Cannabis use may lead to thinning of the cerebral cortex in adolescents, research in mice and humans suggests.

The multilevel study demonstrated that tetrahydrocannabinol (THC), an active substance in cannabis, causes shrinkage of dendritic arborization — the neurons’ network of antennae that play a critical role in communication between brain cells.

The connection between dendritic arborization and cortical thickness was hinted at in an earlier study by Tomáš Paus, MD, PhD, professor of psychiatry and addictology at the University of Montreal, Quebec, Canada, and colleagues, who found that cannabis use in early adolescence was associated with lower cortical thickness in boys with a high genetic risk for schizophrenia.

“We speculated at that time that the differences in cortical thickness might be related to differences in dendritic arborization, and our current study confirmed it,” Paus said.

That confirmation came in the mouse part of the study, when coauthor Graciela Piñeyro, MD, PhD, also of the University of Montreal, counted the dendritic branches of mice exposed to THC and compared the total with the number of dendritic branches in unexposed mice. “What surprised me was finding that THC in the mice was targeting the same type of cells and structures that Dr. Paus had predicted would be affected from the human studies,” she said. “Structurally, they were mostly the neurons that contribute to synapses in the cortex, and their branching was reduced.”

Paus explained that in humans, a decrease in input from the affected dendrites “makes it harder for the brain to learn new things, interact with people, cope with new situations, et cetera. In other words, it makes the brain more vulnerable to everything that can happen in a young person’s life.”

The study was published online on October 9 in the Journal of Neuroscience.
 

Of Mice, Men, and Cannabis

Although associations between cannabis use by teenagers and variations in brain maturation have been well studied, the cellular and molecular underpinnings of these associations were unclear, according to the authors.

To investigate further, they conducted this three-step study. First, they exposed adolescent male mice to THC or a synthetic cannabinoid (WIN 55,212-2) and assessed differentially expressed genes, spine numbers, and the extent of dendritic complexity in the frontal cortex of each mouse.

Next, using MRI, they examined differences in cortical thickness in 34 brain regions in 140 male adolescents who experimented with cannabis before age 16 years and 327 who did not.

Then, they again conducted experiments in mice and found that 13 THC-related genes correlated with variations in cortical thickness. Virtual histology revealed that these 13 genes were coexpressed with cell markers of astrocytes, microglia, and a type of pyramidal cell enriched in genes that regulate dendritic expression.
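
The gene-to-thickness step can be illustrated with a minimal sketch: for a single hypothetical gene, correlate its regional expression with regional cortical-thickness differences across the 34 regions. All values below are simulated stand-ins, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n_regions = 34  # cortical regions, matching the MRI analysis

# Simulated stand-ins (NOT study data): per-region cortical-thickness
# differences (cannabis users minus non-users) and per-region expression
# of one hypothetical THC-related gene.
thickness_diff = rng.normal(0.0, 0.05, n_regions)
gene_expression = 2.0 * thickness_diff + rng.normal(0.0, 0.05, n_regions)

# Pearson correlation across regions: a strong correlation means the gene
# is preferentially expressed where users' cortex differs most in thickness.
r = np.corrcoef(gene_expression, thickness_diff)[0, 1]
print(f"regional expression-thickness correlation: r = {r:.2f}")
```

The study's virtual-histology approach additionally maps such genes onto cell types; this sketch covers only the correlation step.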

Similarly, the WIN-related genes correlated with differences in cortical thickness and showed coexpression patterns with the same three cell types.

Furthermore, the affected genes were also found in humans, particularly in the thinner cortical regions of the adolescents who experimented with cannabis.

By acting on microglia, THC seems to promote the removal of synapses and, eventually, the reduction of the dendritic tree in mice, Piñeyro explained. That’s important not only because a similar mechanism may be at work in humans but also because “we now might have a model to test different types of cannabis products to see which ones are producing the greatest effect on neurons and therefore greater removal of synapses through the microglia. This could be a way of testing drugs that are out in the street to see which would be the most or least dangerous to the synapses in the brain.”
 

 

 

‘Significant Implications’

Commenting on the study, Yasmin Hurd, PhD, Ward-Coleman chair of translational neuroscience at the Icahn School of Medicine at Mount Sinai and director of the Addiction Institute of Mount Sinai in New York City, said, “These findings are in line with previous results, so they are feasible. This study adds more depth by showing that cortical genes that were differentially altered by adolescent THC correlated with cannabis-related changes in cortical thickness based on human neuroimaging data.” Hurd did not participate in the research.

“The results emphasize that consumption of potent cannabis products during adolescence can impact cortical function, which has significant implications for decision-making and risky behavior as well. It also can increase vulnerability to psychiatric disorders such as schizophrenia.”

Although a mouse model is “not truly the same as the human condition, the fact that the animal model also showed evidence of the morphological changes indicative of reduced cortical thickness, [like] the humans, is strong,” she said.

Additional research could include women and assess potential sex differences, she added.

Ronald Ellis, MD, PhD, an investigator in the Center for Medicinal Cannabis Research at the University of California, San Diego School of Medicine, said, “The findings are plausible and extend prior work showing evidence of increased risk for psychotic disorders later in life in adolescents who use cannabis.” Ellis did not participate in the research.

“Future studies should explore how these findings might vary across different demographic groups, which could provide a more inclusive understanding of how cannabis impacts the brain,” he said. “Additionally, longitudinal studies to track changes in the brain over time could help to establish causal relationships more robustly.

“The take-home message to clinicians at this point is to discuss cannabis use history carefully and confidentially with adolescent patients to better provide advice on its potential risks,” he concluded.

Paus added that he would tell patients, “If you’re going to use cannabis, don’t start early. If you have to, then do so in moderation. And if you have a family history of mental illness, be very careful.”

No funding for the study was reported. Paus, Piñeyro, Hurd, and Ellis declared having no relevant financial relationships. 
 

A version of this article appeared on Medscape.com.

FROM THE JOURNAL OF NEUROSCIENCE


Novel Intervention Slows Cognitive Decline in At-Risk Adults

Article Type
Changed

Combining cognitive remediation with transcranial direct current stimulation (tDCS) was associated with slower cognitive decline for up to 6 years in older adults with major depressive disorder that is in remission (rMDD), mild cognitive impairment (MCI), or both, new research suggests.

The cognitive remediation intervention included a series of progressively difficult computer-based and facilitator-monitored mental exercises designed to sharpen cognitive function. 

Researchers found that using cognitive remediation with tDCS slowed decline in executive function and verbal memory more than other cognitive functions. The effect was stronger among people with rMDD versus those with MCI and in those at low genetic risk for Alzheimer’s disease. 

“We have developed a novel intervention, combining two interventions that if used separately have a weak effect but together have substantial and clinically meaningful effect of slowing the progression of cognitive decline,” said study author Benoit H. Mulsant, MD, chair of the Department of Psychiatry, University of Toronto, Ontario, Canada, and senior scientist at the Centre for Addiction and Mental Health, also in Toronto. 

The findings were published online in JAMA Psychiatry.
 

High-Risk Group

Research shows that older adults with MDD or MCI are at high risk for cognitive decline and dementia. Evidence also suggests that depression in early or mid-life significantly increases the risk for dementia in late life, even if the depression has been in remission for decades.

A potential mechanism underlying this increased risk for dementia could be impaired cortical plasticity, or the ability of the brain to compensate for damage.

The PACt-MD trial included 375 older adults with rMDD, MCI, or both (mean age, 72 years; 62% women) at five academic hospitals in Toronto.

Participants received either cognitive remediation plus tDCS or sham intervention 5 days per week for 8 weeks (acute phase), followed by 5-day “boosters” every 6 months.

tDCS was administered by trained personnel and involved active stimulation for 30 minutes at the beginning of each cognitive remediation group session. The intervention targets the prefrontal cortex, a critical region for cognitive compensation in normal cognitive aging.

The sham group received a weakened version of cognitive remediation, with exercises that did not get progressively more difficult. For the sham stimulation, the current flowed at full intensity for only 54 seconds before and after 30-second ramp-up and ramp-down phases, to create a blinding effect, the authors noted. 

A geriatric psychiatrist followed all participants throughout the study, conducting assessments at baseline, month 2, and yearly for 3-7 years (mean follow-up, 48.3 months). 

Participants’ depressive symptoms were evaluated at baseline and at all follow-ups, and participants underwent neuropsychological testing to assess six cognitive domains: processing speed, working memory, executive functioning, verbal memory, visual memory, and language.

To get a norm for the cognitive tests, researchers recruited a comparator group of 75 subjects similar in age, gender, and years of education, with no neuropsychiatric disorder or cognitive impairment. They completed the same assessments but not the intervention.
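
Norming against a comparator group of this kind typically amounts to a z transformation of each raw test score. A minimal sketch with made-up numbers (the formula, not the values, is the point):

```python
import statistics

# Made-up raw scores on one cognitive test (NOT study data).
comparator_scores = [52, 48, 50, 55, 47, 51, 49, 53, 46, 50]  # normative group
participant_raw = 44.0

# Norm the participant's score against the comparator group:
# z = (raw score - comparator mean) / comparator standard deviation.
mean = statistics.mean(comparator_scores)
sd = statistics.stdev(comparator_scores)
z = (participant_raw - mean) / sd
print(f"z score = {z:.2f}")  # negative values fall below the normative mean
```

Domain and global composites are then typically averages of such z scores across tests, though the exact compositing used in PACt-MD is not detailed here.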

Study participants and assessors were blinded to treatment assignment.
 

Slower Cognitive Decline

Participants in the intervention group had a significantly slower decline in cognitive function, compared with those in the sham group (adjusted z score difference [active – sham] at month 60, 0.21; P = .006). This is equivalent to slowing cognitive decline by about 4 years, researchers reported. The intervention also showed a positive effect on executive function and verbal memory. 
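
A rough way to see how a z-score difference of 0.21 translates into “about 4 years” is to divide it by an assumed annual rate of decline in SD units. The 0.05-SD-per-year figure below is purely illustrative, not a value reported by the investigators:

```python
# Reported group difference at month 60 (adjusted z score, active minus sham).
z_difference_month60 = 0.21

# Assumed annual cognitive decline in SD units per year. This 0.05 figure is
# an illustrative assumption, not a value reported by the investigators.
assumed_annual_decline_sd = 0.05

years_slowed = z_difference_month60 / assumed_annual_decline_sd
print(f"decline offset by about {years_slowed:.1f} years")
```

Any such conversion is sensitive to the assumed decline rate, which varies by cohort and cognitive domain.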

“If I can push dementia from 85 to 89 years and you die at 86, in practice, I have prevented you from ever developing dementia,” Mulsant said.

The efficacy of cognitive remediation plus tDCS in rMDD could be tied to enhanced neuroplasticity, said Mulsant. 

The treatment worked well in people with a history of depression, regardless of MCI status, but was not as effective for people with just MCI, researchers noted. The intervention also did not work as well among people at genetic risk for Alzheimer’s disease.

“We don’t believe we have discovered an intervention to prevent dementia in people who are at high risk for Alzheimer disease, but we have discovered an intervention that could prevent dementia in people who have a history of depression,” said Mulsant. 

These results suggest the pathways to dementia among people with MCI and rMDD are different, he added. 

Because previous research showed either treatment alone demonstrated little efficacy, researchers said the new results indicate that there may be a synergistic effect of combining the two. 

The ideal amount of treatment and optimal age for initiation still need to be determined, said Mulsant. The study did not include a comparator group without rMDD or MCI, so the observed cognitive benefits might be specific to people with these high-risk conditions. Another study limitation is lack of diversity in terms of ethnicity, race, and education. 
 

Promising, Important Findings

Commenting on the research, Badr Ratnakaran, MD, assistant professor and division director of geriatric psychiatry at Carilion Clinic–Virginia Tech Carilion School of Medicine, Roanoke, said the results are promising and important because there are so few treatment options for the increasing number of older patients with depression and dementia.

The side-effect profile of the combined treatment is better than that of many pharmacologic treatments, Ratnakaran noted. As more research like this comes out, Ratnakaran predicts that cognitive remediation and tDCS will become more readily available.

“This is telling us that the field of psychiatry, and also dementia, is progressing beyond your usual pharmacotherapy treatments,” said Ratnakaran, who also is chair of the American Psychiatric Association’s Council on Geriatric Psychiatry. 

The study received support from the Canada Brain Research Fund of Brain Canada, Health Canada, the Chagnon Family, and the Centre for Addiction and Mental Health Discovery Fund. Mulsant reported holding and receiving support from the Labatt Family Chair in Biology of Depression in Late-Life Adults at the University of Toronto; being a member of the Centre for Addiction and Mental Health Board of Trustees; research support from Brain Canada, Canadian Institutes of Health Research, Centre for Addiction and Mental Health Foundation, Patient-Centered Outcomes Research Institute, and National Institutes of Health; and nonfinancial support from Capital Solution Design and HappyNeuron. Ratnakaran reported no relevant conflicts.

A version of this article appeared on Medscape.com.


FROM JAMA PSYCHIATRY


A Finger-Prick Test for Alzheimer’s Disease?

Article Type
Changed

A finger-prick blood test can accurately identify p-tau217 — a key biomarker of Alzheimer’s disease — without the need for temperature or storage control measures.

In a pilot study, researchers found a good correlation of p-tau217 levels from blood obtained via standard venous sampling and from a single finger prick.

“We see the potential that capillary p-tau217 from dried blood spots could overcome the limitations of standard venous collection of being invasive, dependent on centrifuges and ultra-low temperature freezers, and also requiring less volume than standard plasma analysis,” said lead investigator Hanna Huber, PhD, Department of Psychiatry and Neurochemistry, Institute of Neuroscience and Physiology, University of Gothenburg, Sweden. 

The findings were presented at the 17th Clinical Trials on Alzheimer’s Disease (CTAD) conference.
 

Strong Link Between Venous and Capillary Samples 

p-tau217 has emerged as the most effective blood test to identify Alzheimer’s disease. However, traditional venous blood sampling requires certain infrastructure and immediate processing. Increased and simplified access to this blood biomarker could be crucial for early diagnosis, proper patient management, and prompt initiation of disease-modifying treatments. 

The DROP-AD project is investigating the diagnostic performance of finger-prick collection to accurately measure p-tau217. In the current study, the research team obtained paired venous blood and capillary blood samples from 206 adults (mean age, 71.8 years; 59% women), with or without cognitive impairment, from five European centers. A subset of participants provided a second finger-prick sample collected without any supervision. 

The capillary blood samples were obtained via a single finger prick, and then single blood drops were applied to a dried plasma spot (DPS) card, which was then shipped to a lab (without temperature control or cooling) for p-tau217 measurement. Cerebrospinal fluid biomarkers were available for a subset of individuals.

Throughout the entire study population, there was a “very convincing correlation” between p-tau217 levels from capillary DPS and venous plasma, Huber told conference attendees. 

Additionally, capillary DPS p-tau217 levels were able to discriminate amyloid-positive from amyloid-negative individuals, with levels of this biomarker increasing in a stepwise fashion, “from cognitively unimpaired individuals to individuals with mild cognitive impairment and, finally, to dementia patients,” Huber said.
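
A standard way to quantify how well a biomarker separates amyloid-positive from amyloid-negative groups is the area under the ROC curve. The sketch below computes it via its Mann-Whitney interpretation on entirely hypothetical p-tau217 values; the study did not report these numbers:

```python
# Hypothetical p-tau217 levels in arbitrary units (NOT study data).
amyloid_negative = [0.8, 1.1, 0.9, 1.3, 1.0, 1.2]
amyloid_positive = [1.9, 2.4, 1.6, 2.8, 2.1, 1.2]

# ROC area via its Mann-Whitney interpretation: the probability that a
# randomly chosen positive case shows a higher level than a randomly
# chosen negative case (ties count half).
wins = ties = 0
for pos in amyloid_positive:
    for neg in amyloid_negative:
        if pos > neg:
            wins += 1
        elif pos == neg:
            ties += 1

auc = (wins + 0.5 * ties) / (len(amyloid_positive) * len(amyloid_negative))
print(f"AUC = {auc:.2f}")  # 0.5 = chance; 1.0 = perfect separation
```

In practice, ROC analyses of this kind are run with established statistical software; this sketch only illustrates the metric behind "discriminating" the two groups.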

Of note, capillary p-tau217 levels from DPS samples that were collected by research staff did not differ from unsupervised self-collected samples. 

What about the stability of the samples? Capillary DPS p-tau217 is “stable over 2 weeks at room temperature,” Huber said. 
 

Ready for Prime Time?

Preliminary data from the DROP-AD project highlight the potential of using finger-prick blood collection to identify neurofilament light (NfL) and glial fibrillary acidic protein (GFAP), two other Alzheimer’s disease biomarkers.

“We think that capillary p-tau217, but also other biomarkers, could be a widely accessible and cheap alternative for clinical practice and clinical trials in individuals with cognitive decline if the results are confirmed in longitudinal and home-sampling cohorts,” Huber concluded. 

“Measuring biomarkers by a simple finger prick could facilitate regular and autonomous sampling at home, which would be particularly useful in remote and rural settings,” she noted. 

The findings in this study confirm and extend earlier findings that the study team reported last year at the Alzheimer’s Association International Conference (AAIC). 

“The data shared at CTAD 2024, along with the related material previously presented at AAIC 2023, reporting on a ‘finger prick’ blood test approach is interesting and emerging work but not yet ready for clinical use,” said Rebecca M. Edelmayer, PhD, Alzheimer’s Association vice president of scientific engagement.

“That said, the idea of a highly accessible and scalable tool that can aid in easier and more equitable diagnosis would be welcomed by researchers, clinicians, and individuals and families affected by Alzheimer’s disease and all other dementias,” Edelmayer said.

“This finger-prick blood testing technology for Alzheimer’s biomarkers still has to be validated more broadly, but it is very promising. Advancements in technology and practice demonstrate the simplicity, transportability, and diagnostic value of blood-based biomarkers for Alzheimer’s,” she added. 

The Alzheimer’s Association is currently conducting a systematic review of the evidence and preparing clinical practice guidelines on blood-based biomarker tests for specialized healthcare settings, with publications, clinical resources, and tools anticipated in 2025, Edelmayer noted. 

The study had no commercial funding. Huber and Edelmayer report no relevant conflicts of interest. 
 

A version of this article appeared on Medscape.com.



FROM CTAD 2024


Preventing Pediatric Migraine


I suspect you all have some experience with childhood migraine. It can mean a painful several hours for the patient, arriving often without warning, with recurrences spaced months or sometimes even years apart. It may be accompanied by vomiting, which in some cases overshadows the severity of the headache. It can result in lost days from school and ruin family activities. It can occur so infrequently that the family can’t recall accurately when the last episode happened. In some ways it is a different animal than the adult version.

Most of the pediatric patients with migraine I saw experienced attacks so infrequently that the families and I seldom discussed medication as an option. Back then imipramine was the only choice. Currently, however, more than a half dozen medications and combinations have been tried. Recently a review of 45 clinical trials of these medications was published in JAMA Network Open.

Dr. William G. Wilkoff

I will let you review for yourself the details of these Iranian investigators’ network meta-analysis, but the bottom line is that some medications were associated with a reduction in migraine frequency, while others were associated with reduced headache intensity. “However, no treatments were associated with significant improvements in quality of life or reduction of the duration of migraine attacks.”

Obviously, this paper illustrates clearly that we have not yet discovered the medicinal magic bullet for pediatric migraine prophylaxis. This doesn’t surprise me. After listening to scores of families tell their migraine stories, it became apparent to me that there was often a pattern in which the child’s headache had arrived after a period of acute sleep deprivation. For example, a trip to an amusement park in which travel or excitement may have resulted in the child going to bed later and/or getting up earlier. By afternoon the child’s reserves of something (currently unknown) were depleted to a point that the headache and/or vomiting struck.

These episodes were often so infrequent, separated by months, that taking a history that demonstrated a recurring pattern could require considerable patience on the part of the family and the provider, even for a physician like myself who believes that better sleep is the answer for everything. However, once I could convince a family of the connection between the sleep deprivation and the headaches, they could often recall other episodes in the past that substantiated my explanation.

In some cases there was no obvious history of acute sleep deprivation, or at least it was so subtle that even a history taker with a sleep obsession couldn’t detect it. However, in these cases I could usually elicit a history of chronic sleep deprivation. For example, falling asleep instantly on automobile rides, difficulty with waking in the morning, or unhealthy bedtime routines. With this underlying vulnerability of chronic sleep deprivation, a slightly more exciting or vigorous day was all that was necessary to trigger the headache.

For those of you who don’t share my contention that childhood migraine is usually the result of sleep deprivation, consider its similarity to an epileptic seizure, which can also be triggered by fatigue. Both events are usually followed by a deep sleep from which the child wakes refreshed and symptom free.

I think it is interesting that this recent meta-analysis found no quality-of-life benefit for any of the medications. The explanation may be that the child with migraine already had a somewhat diminished quality of life as a result of the sleep deprivation, whether acute or chronic.

When speaking with parents of migraine sufferers, I would tell them that once the headache had started there was little I had to offer to forestall the inevitable pain and vomiting. Certainly not in the form of an oral medication. While many adults will have an aura that warns them of the headache onset, I have found that most children don’t describe an aura. It may be they simply lack the ability to express it. Occasionally an observant parent may detect pallor or a behavior change that indicates a migraine is beginning. On rare occasions a parent may be able to abort the attack by quickly getting the child to a quiet, dark, and calm environment.

Although this recent meta-analysis of treatment options is discouraging, it may provide a clue to effective prophylaxis. Some of the medications that decrease the frequency of the attacks may do so because they improve the patient’s sleep patterns. Those that decrease the intensity of the pain are probably working on a pain pathway that is not specific to migraine.

Continuing the search for a prophylactic medication is a worthy goal, particularly for those patients whose migraines are debilitating. However, based on my experience, enhanced by my bias, the safest and most effective prophylaxis results from increasing the family’s awareness of the role that sleep deprivation plays in the illness. Even when the family buys into the message and attempts to avoid situations that will tax their vulnerable children, parents will need to accept that sometimes stuff happens, even though siblings and peers may be able to tolerate the same situation. Spontaneous activities can converge on a day when, for whatever reason, the migraine-prone child is overtired, and the headache and vomiting will erupt.

A lifestyle change is always preferable to a pharmacological intervention. However, that doesn’t mean it is always easy to achieve.

Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics, including “How to Say No to Your Toddler.” Other than a Littmann stethoscope he accepted as a first-year medical student in 1966, Dr. Wilkoff reports having nothing to disclose. Email him at [email protected].



Rising Stroke Rates in Californians With Sickle Cell Disease


 

TOPLINE:

Stroke rates in Californians with sickle cell disease (SCD) have increased in both children and adults in the post-STOP era. The cumulative incidence of first ischemic stroke was 2.1% by age 20 and 13.5% by age 60.

METHODOLOGY:

  • Researchers analyzed data from the California Department of Health Care Access and Innovation (HCAI), covering emergency department and hospitalization records from 1991 to 2019.
  • A total of 7636 patients with SCD were included in the study cohort.
  • Cumulative incidence and rates for primary and recurrent strokes and transient ischemic attacks (TIAs) were determined for the periods before and after the STOP trial.
  • Patients with SCD were identified using ICD-9 and ICD-10 codes, with specific criteria for inclusion based on hospitalization records.
  • The study used the Fine and Gray method to calculate cumulative incidence functions, accounting for the competing risk of death.
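The quantity a Fine and Gray analysis targets is the cumulative incidence function (CIF): the probability of a first stroke by a given age when death can intervene first. As a minimal illustrative sketch (not the authors' code; the follow-up times and event codes below are invented), the nonparametric CIF that such an analysis builds on can be computed like this:

```python
# Illustrative only: nonparametric cumulative incidence of a first stroke
# in the presence of death as a competing risk, on synthetic data.
def cumulative_incidence(times, events):
    """times: follow-up time per patient.
    events: 0 = censored, 1 = stroke (event of interest), 2 = death (competing).
    Returns a list of (time, cumulative incidence of stroke) pairs."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    surv = 1.0  # probability of being event-free just before time t
    cif = 0.0   # cumulative incidence of stroke
    out = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d_stroke = d_death = d_cens = 0
        while i < len(pairs) and pairs[i][0] == t:  # group ties at time t
            e = pairs[i][1]
            if e == 1:
                d_stroke += 1
            elif e == 2:
                d_death += 1
            else:
                d_cens += 1
            i += 1
        # only the event-free can still have a first stroke at t
        cif += surv * d_stroke / n_at_risk
        surv *= 1 - (d_stroke + d_death) / n_at_risk
        n_at_risk -= d_stroke + d_death + d_cens
        out.append((t, cif))
    return out

times = [2, 3, 5, 5, 7, 8, 10, 12]
events = [1, 2, 0, 1, 2, 1, 0, 0]
for t, c in cumulative_incidence(times, events):
    print(t, round(c, 3))
```

Note that treating deaths as ordinary censoring and reporting 1 minus a Kaplan-Meier estimate would overstate stroke risk; the CIF correctly splits probability between stroke and death, which is why competing-risks methods are standard in SCD cohorts with substantial mortality.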

TAKEAWAY:

  • The cumulative incidence of first ischemic stroke in patients with SCD was 2.1% by age 20 and 13.5% by age 60.
  • Ischemic stroke rates increased significantly in children and adults in the 2010-2019 period, compared with the preceding decade.
  • Risk factors for stroke and TIA included increasing age, hypertension, and hyperlipidemia.
  • The study found a significant increase in rates of intracranial hemorrhage in adults aged 18-30 years and TIAs in children younger than 18 years from 2010 to 2019, compared with the prior decade.

IN PRACTICE:

“Neurovascular complications, including strokes and transient ischemic attacks (TIAs), are common and cause significant morbidity in individuals with sickle cell disease (SCD). The STOP trial (1998) established chronic transfusions as the standard of care for children with SCD at high risk for stroke,” the study’s authors wrote.

SOURCE:

This study was led by Olubusola B. Oluwole, MD, MS, University of Pittsburgh in Pennsylvania, and was published online in Blood.

LIMITATIONS:

This study’s reliance on administrative data may have introduced systematic errors, particularly with the transition from ICD-9 to ICD-10 codes. The lack of laboratory results and medication data in the HCAI database limited the ability to fully assess patient conditions and treatments. Additionally, the methodology changes in 2014 likely underreported death rates in people without PDD/EDU encounters in the calendar year preceding their death.

DISCLOSURES:

The authors reported no relevant conflicts of interest.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

 

TOPLINE:

Stroke rates in Californians with sickle cell disease (SCD) have increased in both children and adults in the post-STOP era. The cumulative incidence of first ischemic stroke was 2.1% by age 20 and 13.5% by age 60.

METHODOLOGY:

  • Researchers analyzed data from the California Department of Health Care Access and Innovation (HCAI), covering emergency department and hospitalization records from 1991 to 2019.
  • A total of 7636 patients with SCD were included in the study cohort.
  • Cumulative incidence and rates for primary and recurrent strokes and transient ischemic attacks (TIAs) were determined before and after the STOP trial.
  • Patients with SCD were identified using ICD-9 and ICD-10 codes, with specific criteria for inclusion based on hospitalization records.
  • The study utilized Fine and Gray methodology to calculate cumulative incidence functions, accounting for the competing risk for death.
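The Fine and Gray method named above is a regression model; the minimal nonparametric sketch below (an Aalen-Johansen-style cumulative incidence calculation on invented toy data, not the study's code) illustrates the core idea: deaths without stroke deplete the risk set as a competing event rather than being treated as ordinary censoring.

```python
# Toy cumulative incidence function (CIF) with a competing risk, in the
# spirit of the analysis described above. Data are invented.
# Event codes: 0 = censored, 1 = first stroke (event of interest),
# 2 = death without stroke (competing risk).

def cumulative_incidence(times, events, event_of_interest=1):
    """Nonparametric (Aalen-Johansen-style) CIF, one point per event time."""
    data = sorted(zip(times, events))
    surv = 1.0   # overall event-free survival just before time t
    cif = 0.0    # cumulative incidence of the event of interest
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d_int = sum(1 for tt, e in data if tt == t and e == event_of_interest)
        d_any = sum(1 for tt, e in data if tt == t and e != 0)
        n_t = sum(1 for tt, e in data if tt >= t)  # still at risk at t
        cif += surv * d_int / n_t   # only strokes add to the stroke CIF
        surv *= 1 - d_any / n_t     # any event (stroke or death) shrinks survival
        curve.append((t, cif))
        i += sum(1 for tt, _ in data if tt == t)  # skip past ties
    return curve

# Ages at event (years) for a tiny made-up cohort
times  = [10, 20, 20, 35, 40, 50, 55, 60]
events = [ 1,  2,  0,  1,  2,  1,  0,  0]
for age, p in cumulative_incidence(times, events):
    print(f"age {age}: CIF = {p:.3f}")
```

Note that the competing deaths (ages 20 and 40) never increase the stroke curve, but they do reduce the survival weight applied to later strokes, which is what distinguishes a CIF from a naive 1 − Kaplan-Meier estimate.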

TAKEAWAY:

  • The cumulative incidence of first ischemic stroke in patients with SCD was 2.1% by age 20 and 13.5% by age 60.
  • Ischemic stroke rates increased significantly in children and adults in the 2010-2019 period, compared with the preceding decade.
  • Risk factors for stroke and TIA included increasing age, hypertension, and hyperlipidemia.
  • The study found a significant increase in rates of intracranial hemorrhage in adults aged 18-30 years and TIAs in children younger than 18 years from 2010 to 2019, compared with the prior decade.

IN PRACTICE:

“Neurovascular complications, including strokes and transient ischemic attacks (TIAs), are common and cause significant morbidity in individuals with sickle cell disease (SCD). The STOP trial (1998) established chronic transfusions as the standard of care for children with SCD at high risk for stroke,” the study’s authors wrote.

SOURCE:

This study was led by Olubusola B. Oluwole, MD, MS, University of Pittsburgh in Pennsylvania, and was published online in Blood.

LIMITATIONS:

This study’s reliance on administrative data may have introduced systematic errors, particularly with the transition from ICD-9 to ICD-10 codes. The lack of laboratory results and medication data in the HCAI database limited the ability to fully assess patient conditions and treatments. Additionally, the methodology changes in 2014 likely underreported death rates in people without PDD/EDU encounters in the calendar year preceding their death.

DISCLOSURES:

The authors reported no relevant conflicts of interest.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.

 


New Data on DOAC Initiation After Stroke in AF: Final Word?


— The long-standing debate as to when to start anticoagulation in patients with an acute ischemic stroke and atrial fibrillation (AF) looks as though it’s settled.

Results of the OPTIMAS trial, the largest trial to address this question, showed that initiation of a direct oral anticoagulant (DOAC) within 4 days after ischemic stroke associated with AF was noninferior to delayed initiation (7-14 days) for the composite outcome of ischemic stroke, intracranial hemorrhage, unclassifiable stroke, or systemic embolism at 90 days. Importantly, early DOAC initiation was safe with a low rate of symptomatic hemorrhage, regardless of stroke severity.

In addition, a new meta-analysis, known as CATALYST, which included all four randomized trials now available on this issue, showed a clear benefit of earlier initiation (within 4 days) versus later (5 days and up) on its primary endpoint of new ischemic stroke, symptomatic intracerebral hemorrhage, and unclassified stroke at 30 days.

The results of the OPTIMAS trial and the meta-analysis were both presented at the 16th World Stroke Congress (WSC) 2024. The OPTIMAS trial was also simultaneously published online in The Lancet.

“Our findings do not support the guideline-recommended practice of delaying DOAC initiation after ischemic stroke with AF regardless of clinical stroke severity, reperfusion, or prior anticoagulation,” said OPTIMAS investigator David Werring, PhD, University College London in England.

Presenting the meta-analysis, Signild Åsberg, MD, Uppsala University, Uppsala, Sweden, said her group’s findings “support the early start of DOACs (within 4 days) in clinical practice.”

Werring pointed out that starting anticoagulation early also had important logistical advantages.

“This means we can start anticoagulation before patients are discharged from hospital, thus ensuring that this important secondary prevention medication is always prescribed, when appropriate. That’s going to be a key benefit in the real world.”
 

Clinical Dilemma

Werring noted that AF accounts for 20%-30% of ischemic strokes, which tend to be more severe than other stroke types. The pivotal trials of DOACs did not include patients within 30 days of an acute ischemic stroke, creating a clinical dilemma on when to start this treatment.

“On the one hand, we wish to start anticoagulation early to reduce early recurrence of ischemic stroke. But on the other hand, there are concerns that if we start anticoagulation early, it could cause intracranial bleeding, including hemorrhagic transformation of the acute infarct. Guidelines on this issue are inconsistent and have called for randomized control trials in this area,” he noted.

So far, three randomized trials on DOAC timing have been conducted, which Werring said suggested early DOAC treatment is safe. However, these trials have provided limited data on moderate to severe stroke, patients with hemorrhagic transformation, or those already taking oral anticoagulants — subgroups in which there are particular concerns about early oral anticoagulation.

The OPTIMAS trial included a broad population of patients with acute ischemic stroke associated with AF including these critical subgroups.

The trial, conducted at 100 hospitals in the United Kingdom, included 3648 patients with AF and acute ischemic stroke who were randomly assigned to early (≤ 4 days from stroke symptom onset) or delayed (7-14 days) anticoagulation initiation with any DOAC.

There was no restriction on stroke severity, and patients with hemorrhagic transformation were allowed, with the exception of parenchymal hematoma type 2, a rare and severe type of hemorrhagic transformation.

Approximately 35% of patients had been taking an oral anticoagulant, mainly DOACs, prior to their stroke, and about 30% had revascularization with thrombolysis, thrombectomy, or both. Nearly 900 participants (25%) had moderate to severe stroke (National Institutes of Health Stroke Scale [NIHSS] score ≥ 11).

The primary outcome was a composite of recurrent ischemic stroke, symptomatic intracranial hemorrhage, unclassifiable stroke, or systemic embolism incidence at 90 days. The initial analysis aimed to show noninferiority of early DOAC initiation, with a noninferiority margin of 2 percentage points, followed by testing for superiority.

Results showed that the primary outcome occurred in 3.3% of both groups (adjusted risk difference, 0.000; 95% CI, −0.011 to 0.012), with noninferiority criteria fulfilled. Superiority was not achieved.
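As a rough back-of-the-envelope check of this noninferiority logic (illustrative arithmetic only; the trial used a covariate-adjusted analysis, and the event counts below are invented round numbers consistent with roughly 3.3% events in each of two arms of about 1824 patients):

```python
# Hedged sketch: noninferiority holds if the upper bound of the 95% CI
# for the risk difference (early minus delayed) stays below the margin.
import math

def risk_difference_ci(x1, n1, x0, n0, z=1.96):
    """Unadjusted risk difference with a Wald 95% CI."""
    p1, p0 = x1 / n1, x0 / n0
    rd = p1 - p0
    se = math.sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
    return rd, rd - z * se, rd + z * se

# Invented round numbers: 60 events per arm of 1824 patients (~3.3%)
rd, lo, hi = risk_difference_ci(60, 1824, 60, 1824)
margin = 0.02  # the 2-percentage-point noninferiority margin
print(f"RD = {rd:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
print("noninferior" if hi < margin else "not shown noninferior")
```

With identical event rates the point estimate is zero and the CI spans roughly −0.012 to 0.012, comfortably inside the 2-percentage-point margin, which mirrors the interval reported above.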

Symptomatic intracranial hemorrhage occurred in 0.6% of patients in the early DOAC initiation group vs 0.7% of those in the delayed group — a nonsignificant difference.
 

 

 

Applicable to Real-World Practice

A time-to-event analysis of the primary outcome showed that there were fewer outcomes in the first 30 days in the early DOAC initiation group, but the curves subsequently came together.

Subgroup analysis showed consistent results across the whole trial population, with no modification of the effect of early DOAC initiation according to stroke severity, reperfusion treatment, or previous anticoagulation.

Werring said that strengths of the OPTIMAS trial included a large sample size, a broad population with generalizability to real-world practice, and the inclusion of patients at higher bleeding risk than included in previous studies.

During the discussion, it was noted that the trial included few patients (about 3%) with very severe stroke (NIHSS score > 21), raising the question of whether the findings could be applied to this group.

Werring noted that there was no evidence of heterogeneity, and if anything, patients with more severe strokes may have had a slightly greater benefit with early DOAC initiation. “So my feeling is probably these results do generalize to the more severe patients,” he said.

In a commentary accompanying The Lancet publication of the OPTIMAS trial, Else Charlotte Sandset, MD, University of Oslo, in Norway, and Diana Aguiar de Sousa, MD, Central Lisbon University Hospital Centre, Lisbon, Portugal, noted that the “increasing body of evidence strongly supports the message that initiating anticoagulation early for patients with ischaemic stroke is safe. The consistent absence of heterogeneity in safety outcomes suggests that the risk of symptomatic intracranial haemorrhage is not a major concern, even in patients with large infarcts.”

Regardless of the size of the treatment effect, initiating early anticoagulation makes sense when it can be done safely, as it helps prevent recurrent ischemic strokes and other embolic events. Early intervention reduces embolization risk, particularly in high-risk patients, and allows secondary prevention measures to begin while patients are still hospitalized, they added.
 

CATALYST Findings

The CATALYST meta-analysis included four trials, namely, TIMING, ELAN, OPTIMAS, and START, of early versus later DOAC administration in a total of 5411 patients with acute ischemic stroke and AF. In this meta-analysis, early was defined as within 4 days of stroke and later as 5 days or more.

The primary outcome was a composite of ischemic stroke, symptomatic intracerebral hemorrhage, or unclassified stroke at 30 days. This occurred in 2.12% of the early group versus 3.02% of the later group, a significant reduction (odds ratio, 0.70; 95% CI, 0.50-0.98; P = .04).
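The reported odds ratio can be sanity-checked directly from the two event rates (a sketch only; the meta-analysis itself pooled trial-level estimates rather than raw rates):

```python
# Odds ratio from the two reported event proportions
def odds_ratio(p_treat, p_ctrl):
    return (p_treat / (1 - p_treat)) / (p_ctrl / (1 - p_ctrl))

or_est = odds_ratio(0.0212, 0.0302)  # 2.12% early vs 3.02% later
print(f"OR ≈ {or_est:.2f}")  # close to the reported 0.70
```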

The results were consistent across all subgroups, all suggesting an advantage for early DOAC.

Further analysis showed a clear benefit of early DOAC initiation in ischemic stroke with the curves separating early.

The rate of symptomatic intracerebral hemorrhage was low in both groups (0.45% in the early group and 0.40% in the later group) as was extracranial hemorrhage (0.45% vs 0.55%).

At 90 days, there were still lower event rates in the early group than the later one, but the difference was no longer statistically significant.
 

‘Practice Changing’ Results

Commenting on both studies, Craig Anderson, MD, The George Institute for Global Health, Sydney, Australia, who chaired the WSC session at which the OPTIMAS trial and the meta-analysis were presented, described these latest results as “practice changing.”

“When to start anticoagulation in acute ischemic stroke patients with AF has been uncertain for a long time. The dogma has always been that we should wait. Over the years, we’ve become a little bit more confident, but now we’ve got good data from randomized trials showing that early initiation is safe, with the meta-analysis showing benefit,” he said.

“These new data from OPTIMAS will reassure clinicians that there’s no excessive harm and, more importantly, no excessive harm across all patient groups. And the meta-analysis clearly showed an upfront benefit of starting anticoagulation early. That’s a very convincing result,” he added.

Anderson cautioned that there still may be concerns about starting DOACs early in some groups, including Asian populations that have a higher bleeding risk (these trials included predominantly White patients) and people who are older or frail, who may have extensive small vessel disease.

During the discussion, several questions centered on the lack of imaging data available on the patients in the studies. Anderson said imaging data would help reassure clinicians on the safety of early anticoagulation in patients with large infarcts.

“Stroke clinicians make decisions on the basis of the patient and on the basis of the brain, and we only have the patient information at the moment. We don’t have information on the brain — that comes from imaging.”

Regardless, he believes these new data will lead to a shift in practice. “But maybe, it won’t be as dramatic as we would hope because I think some clinicians may still hesitate to apply these results to patients at high risk of bleeding. With imaging data from the studies that might change.”

The OPTIMAS trial was funded by University College London and the British Heart Foundation. Werring reported consulting fees from Novo Nordisk, National Institute for Health and Care Excellence, and Alnylam; payments or speaker honoraria from Novo Nordisk, Bayer, and AstraZeneca/Alexion; participation on a data safety monitoring board for the OXHARP trial; and participation as steering committee chair for the MACE-ICH and PLINTH trials. Åsberg received institutional research grants and lecture fees to her institution from AstraZeneca, Boehringer Ingelheim, Bristol Myers Squibb, and Institut Produits Synthése. Sandset and de Sousa were both steering committee members of the ELAN trial. Anderson reported grant funding from Penumbra and Takeda China.
 

A version of this article appeared on Medscape.com.

Publications
Topics
Sections

— The long-standing debate as to when to start anticoagulation in patients with an acute ischemic stroke and atrial fibrillation (AF) looks as though it’s settled.

Results of the OPTIMAS trial, the largest trial to address this question, showed that initiation of a direct oral anticoagulant (DOAC) within 4 days after ischemic stroke associated with AF was noninferior to delayed initiation (7-14 days) for the composite outcome of ischemic stroke, intracranial hemorrhage, unclassifiable stroke, or systemic embolism at 90 days. Importantly, early DOAC initiation was safe with a low rate of symptomatic hemorrhage, regardless of stroke severity.

In addition, a new meta-analysis, known as CATALYST, which included all four randomized trials now available on this issue, showed a clear benefit of earlier initiation (within 4 days) versus later (5 days and up) on its primary endpoint of new ischemic stroke, symptomatic intracerebral hemorrhage, and unclassified stroke at 30 days.

The results of the OPTIMAS trial and the meta-analysis were both presented at the 16th World Stroke Congress (WSC) 2024. The OPTIMAS trial was also simultaneously published online in The Lancet.

“Our findings do not support the guideline recommended practice of delaying DOAC initiation after ischemic stroke with AF regardless of clinical stroke severity, reperfusion or prior anticoagulation,” said OPTIMAS investigator David Werring, PhD, University College London in England.

Presenting the meta-analysis, Signild Åsberg, MD, Uppsala University, Uppsala, Sweden, said his group’s findings “support the early start of DOACs (within 4 days) in clinical practice.”

Werring pointed out that starting anticoagulation early also had important logistical advantages.

“This means we can start anticoagulation before patients are discharged from hospital, thus ensuring that this important secondary prevention medication is always prescribed, when appropriate. That’s going to be a key benefit in the real world.”
 

Clinical Dilemma

Werring noted that AF accounts for 20%-30% of ischemic strokes, which tend to be more severe than other stroke types. The pivotal trials of DOACs did not include patients within 30 days of an acute ischemic stroke, creating a clinical dilemma on when to start this treatment.

“On the one hand, we wish to start anticoagulation early to reduce early recurrence of ischemic stroke. But on the other hand, there are concerns that if we start anticoagulation early, it could cause intracranial bleeding, including hemorrhagic transformation of the acute infarct. Guidelines on this issue are inconsistent and have called for randomized control trials in this area,” he noted.

So far, three randomized trials on DOAC timing have been conducted, which Werring said suggested early DOAC treatment is safe. However, these trials have provided limited data on moderate to severe stroke, patients with hemorrhagic transformation, or those already taking oral anticoagulants — subgroups in which there are particular concerns about early oral anticoagulation.

The OPTIMAS trial included a broad population of patients with acute ischemic stroke associated with AF including these critical subgroups.

The trial, conducted at 100 hospitals in the United Kingdom, included 3648 patients with AF and acute ischemic stroke who were randomly assigned to early (≤ 4 days from stroke symptom onset) or delayed (7-14 days) anticoagulation initiation with any DOAC.

There was no restriction on stroke severity, and patients with hemorrhagic transformation were allowed, with the exception of parenchymal hematoma type 2, a rare and severe type of hemorrhagic transformation.

Approximately 35% of patients had been taking an oral anticoagulant, mainly DOACs, prior to their stroke, and about 30% had revascularization with thrombolysis, thrombectomy, or both. Nearly 900 participants (25%) had moderate to severe stroke (National Institutes of Health Stroke Scale [NIHSS] score ≥ 11).

The primary outcome was a composite of recurrent ischemic stroke, symptomatic intracranial hemorrhage, unclassifiable stroke, or systemic embolism incidence at 90 days. The initial analysis aimed to show noninferiority of early DOAC initiation, with a noninferiority margin of 2 percentage points, followed by testing for superiority.

Results showed that the primary outcome occurred in 3.3% of both groups (adjusted risk difference, 0.000; 95% CI, −0.011 to 0.012), with noninferiority criteria fulfilled. Superiority was not achieved.

Symptomatic intracranial hemorrhage occurred in 0.6% of patients in the early DOAC initiation group vs 0.7% of those in the delayed group — a nonsignificant difference.
 

 

 

Applicable to Real-World Practice

A time-to-event analysis of the primary outcome showed that there were fewer outcomes in the first 30 days in the early DOAC initiation group, but the curves subsequently came together.

Subgroup analysis showed consistent results across all whole trial population, with no modification of the effect of early DOAC initiation according to stroke severity, reperfusion treatment, or previous anticoagulation.

Werring said that strengths of the OPTIMAS trial included a large sample size, a broad population with generalizability to real-world practice, and the inclusion of patients at higher bleeding risk than included in previous studies.

During the discussion, it was noted that the trial included few (about 3%) patients — about 3% — with very severe stroke (NIHSS score > 21), with the question of whether the findings could be applied to this group.

Werring noted that there was no evidence of heterogeneity, and if anything, patients with more severe strokes may have had a slightly greater benefit with early DOAC initiation. “So my feeling is probably these results do generalize to the more severe patients,” he said.

In a commentary accompanying The Lancet publication of the OPTIMAS trial, Else Charlotte Sandset, MD, University of Oslo, in Norway, and Diana Aguiar de Sousa, MD, Central Lisbon University Hospital Centre, Lisbon, Portugal, noted that the “increasing body of evidence strongly supports the message that initiating anticoagulation early for patients with ischaemic stroke is safe. The consistent absence of heterogeneity in safety outcomes suggests that the risk of symptomatic intracranial haemorrhage is not a major concern, even in patients with large infarcts.”

Regardless of the size of the treatment effect, initiating early anticoagulation makes sense when it can be done safely, as it helps prevent recurrent ischemic strokes and other embolic events. Early intervention reduces embolization risk, particularly in high-risk patients, and allows secondary prevention measures to begin while patients are still hospitalized, they added.
 

CATALYST Findings

The CATALYST meta-analysis included four trials, namely, TIMING, ELAN, OPTIMAS, and START, of early versus later DOAC administration in a total of 5411 patients with acute ischemic stroke and AF. In this meta-analysis, early was defined as within 4 days of stroke and later as 5 days or more.

The primary outcome was a composite of ischemic stroke, symptomatic, intracerebral hemorrhage, or unclassified stroke at 30 days. This was significantly reduced in the early group (2.12%) versus 3.02% in the later group, giving an odds ratio of 0.70 (95% CI, 0.50-0.98; P =.04).

The results were consistent across all subgroups, all suggesting an advantage for early DOAC.

Further analysis showed a clear benefit of early DOAC initiation in ischemic stroke with the curves separating early.

The rate of symptomatic intracerebral hemorrhage was low in both groups (0.45% in the early group and 0.40% in the later group) as was extracranial hemorrhage (0.45% vs 0.55%).

At 90 days, there were still lower event rates in the early group than the later one, but the difference was no longer statistically significant.
 

‘Practice Changing’ Results

Commenting on both studies, chair of the WSC session where the results of both OPTIMAS trial and the meta-analysis were presented, Craig Anderson, MD, The George Institute for Global Health, Sydney, Australia, described these latest results as “practice changing.”

“When to start anticoagulation in acute ischemic stroke patients with AF has been uncertain for a long time. The dogma has always been that we should wait. Over the years, we’ve become a little bit more confident, but now we’ve got good data from randomized trials showing that early initiation is safe, with the meta-analysis showing benefit,” he said.

“These new data from OPTIMAS will reassure clinicians that there’s no excessive harm and, more importantly, no excessive harm across all patient groups. And the meta-analysis clearly showed an upfront benefit of starting anticoagulation early. That’s a very convincing result,” he added.

Anderson cautioned that there still may be concerns about starting DOACs early in some groups, including Asian populations that have a higher bleeding risk (these trials included predominantly White patients) and people who are older or frail, who may have extensive small vessel disease.

During the discussion, several questions centered on the lack of imaging data available on the patients in the studies. Anderson said imaging data would help reassure clinicians on the safety of early anticoagulation in patients with large infarcts.

“Stroke clinicians make decisions on the basis of the patient and on the basis of the brain, and we only have the patient information at the moment. We don’t have information on the brain — that comes from imaging.”

Regardless, he believes these new data will lead to a shift in practice. “But maybe, it won’t be as dramatic as we would hope because I think some clinicians may still hesitate to apply these results to patients at high risk of bleeding. With imaging data from the studies that might change.”

The OPTIMAS trial was funded by University College London and the British Heart Foundation. Werring reported consulting fees from Novo Nordisk, National Institute for Health and Care Excellence, and Alnylam; payments or speaker honoraria from Novo Nordisk, Bayer, and AstraZeneca/Alexion; participation on a data safety monitoring board for the OXHARP trial; and participation as steering committee chair for the MACE-ICH and PLINTH trials. Åsberg received institutional research grants and lecture fees to her institution from AstraZeneca, Boehringer Ingelheim, Bristol Myers Squibb, and Institut Produits Synthése. Sandset and de Sousa were both steering committee members of the ELAN trial. Anderson reported grant funding from Penumbra and Takeda China.
 

A version of this article appeared on Medscape.com.

— The long-standing debate as to when to start anticoagulation in patients with an acute ischemic stroke and atrial fibrillation (AF) looks as though it’s settled.

Results of the OPTIMAS trial, the largest trial to address this question, showed that initiation of a direct oral anticoagulant (DOAC) within 4 days after ischemic stroke associated with AF was noninferior to delayed initiation (7-14 days) for the composite outcome of ischemic stroke, intracranial hemorrhage, unclassifiable stroke, or systemic embolism at 90 days. Importantly, early DOAC initiation was safe with a low rate of symptomatic hemorrhage, regardless of stroke severity.

In addition, a new meta-analysis, known as CATALYST, which included all four randomized trials now available on this issue, showed a clear benefit of earlier initiation (within 4 days) versus later (5 days and up) on its primary endpoint of new ischemic stroke, symptomatic intracerebral hemorrhage, and unclassified stroke at 30 days.

The results of the OPTIMAS trial and the meta-analysis were both presented at the 16th World Stroke Congress (WSC) 2024. The OPTIMAS trial was also simultaneously published online in The Lancet.

“Our findings do not support the guideline recommended practice of delaying DOAC initiation after ischemic stroke with AF regardless of clinical stroke severity, reperfusion or prior anticoagulation,” said OPTIMAS investigator David Werring, PhD, University College London in England.

Presenting the meta-analysis, Signild Åsberg, MD, Uppsala University, Uppsala, Sweden, said his group’s findings “support the early start of DOACs (within 4 days) in clinical practice.”

Werring pointed out that starting anticoagulation early also had important logistical advantages.

“This means we can start anticoagulation before patients are discharged from hospital, thus ensuring that this important secondary prevention medication is always prescribed, when appropriate. That’s going to be a key benefit in the real world.”
 

Clinical Dilemma

Werring noted that AF accounts for 20%-30% of ischemic strokes, which tend to be more severe than other stroke types. The pivotal trials of DOACs did not include patients within 30 days of an acute ischemic stroke, creating a clinical dilemma on when to start this treatment.

“On the one hand, we wish to start anticoagulation early to reduce early recurrence of ischemic stroke. But on the other hand, there are concerns that if we start anticoagulation early, it could cause intracranial bleeding, including hemorrhagic transformation of the acute infarct. Guidelines on this issue are inconsistent and have called for randomized control trials in this area,” he noted.

So far, three randomized trials on DOAC timing have been conducted, which Werring said suggested early DOAC treatment is safe. However, these trials have provided limited data on moderate to severe stroke, patients with hemorrhagic transformation, or those already taking oral anticoagulants — subgroups in which there are particular concerns about early oral anticoagulation.

The OPTIMAS trial included a broad population of patients with acute ischemic stroke associated with AF including these critical subgroups.

The trial, conducted at 100 hospitals in the United Kingdom, included 3648 patients with AF and acute ischemic stroke who were randomly assigned to early (≤ 4 days from stroke symptom onset) or delayed (7-14 days) anticoagulation initiation with any DOAC.

There was no restriction on stroke severity, and patients with hemorrhagic transformation were allowed, with the exception of parenchymal hematoma type 2, a rare and severe type of hemorrhagic transformation.

Approximately 35% of patients had been taking an oral anticoagulant, mainly DOACs, prior to their stroke, and about 30% had revascularization with thrombolysis, thrombectomy, or both. Nearly 900 participants (25%) had moderate to severe stroke (National Institutes of Health Stroke Scale [NIHSS] score ≥ 11).

The primary outcome was a composite of recurrent ischemic stroke, symptomatic intracranial hemorrhage, unclassifiable stroke, or systemic embolism incidence at 90 days. The initial analysis aimed to show noninferiority of early DOAC initiation, with a noninferiority margin of 2 percentage points, followed by testing for superiority.

Results showed that the primary outcome occurred in 3.3% of both groups (adjusted risk difference, 0.000; 95% CI, −0.011 to 0.012), with noninferiority criteria fulfilled. Superiority was not achieved.

Symptomatic intracranial hemorrhage occurred in 0.6% of patients in the early DOAC initiation group vs 0.7% of those in the delayed group — a nonsignificant difference.

Applicable to Real-World Practice

A time-to-event analysis of the primary outcome showed that there were fewer outcomes in the first 30 days in the early DOAC initiation group, but the curves subsequently came together.

Subgroup analysis showed consistent results across the whole trial population, with no modification of the effect of early DOAC initiation according to stroke severity, reperfusion treatment, or previous anticoagulation.

Werring said that strengths of the OPTIMAS trial included a large sample size, a broad population with generalizability to real-world practice, and the inclusion of patients at higher bleeding risk than included in previous studies.

During the discussion, it was noted that the trial included few patients (about 3%) with very severe stroke (NIHSS score > 21), raising the question of whether the findings can be applied to this group.

Werring noted that there was no evidence of heterogeneity, and if anything, patients with more severe strokes may have had a slightly greater benefit with early DOAC initiation. “So my feeling is probably these results do generalize to the more severe patients,” he said.

In a commentary accompanying The Lancet publication of the OPTIMAS trial, Else Charlotte Sandset, MD, University of Oslo, in Norway, and Diana Aguiar de Sousa, MD, Central Lisbon University Hospital Centre, Lisbon, Portugal, noted that the “increasing body of evidence strongly supports the message that initiating anticoagulation early for patients with ischaemic stroke is safe. The consistent absence of heterogeneity in safety outcomes suggests that the risk of symptomatic intracranial haemorrhage is not a major concern, even in patients with large infarcts.”

Regardless of the size of the treatment effect, initiating early anticoagulation makes sense when it can be done safely, as it helps prevent recurrent ischemic strokes and other embolic events. Early intervention reduces embolization risk, particularly in high-risk patients, and allows secondary prevention measures to begin while patients are still hospitalized, they added.
 

CATALYST Findings

The CATALYST meta-analysis included four trials, namely, TIMING, ELAN, OPTIMAS, and START, of early versus later DOAC administration in a total of 5411 patients with acute ischemic stroke and AF. In this meta-analysis, early was defined as within 4 days of stroke and later as 5 days or more.

The primary outcome was a composite of ischemic stroke, symptomatic intracerebral hemorrhage, or unclassified stroke at 30 days. The rate was significantly lower in the early group (2.12%) than in the later group (3.02%), giving an odds ratio of 0.70 (95% CI, 0.50-0.98; P = .04).

The results were consistent across all subgroups, all suggesting an advantage for early DOAC.

Further analysis showed a clear benefit of early DOAC initiation in ischemic stroke with the curves separating early.

The rate of symptomatic intracerebral hemorrhage was low in both groups (0.45% in the early group and 0.40% in the later group) as was extracranial hemorrhage (0.45% vs 0.55%).

At 90 days, there were still lower event rates in the early group than the later one, but the difference was no longer statistically significant.
 

‘Practice Changing’ Results

Commenting on both studies, Craig Anderson, MD, The George Institute for Global Health, Sydney, Australia, who chaired the WSC session where the OPTIMAS trial and the meta-analysis were presented, described these latest results as “practice changing.”

“When to start anticoagulation in acute ischemic stroke patients with AF has been uncertain for a long time. The dogma has always been that we should wait. Over the years, we’ve become a little bit more confident, but now we’ve got good data from randomized trials showing that early initiation is safe, with the meta-analysis showing benefit,” he said.

“These new data from OPTIMAS will reassure clinicians that there’s no excessive harm and, more importantly, no excessive harm across all patient groups. And the meta-analysis clearly showed an upfront benefit of starting anticoagulation early. That’s a very convincing result,” he added.

Anderson cautioned that there still may be concerns about starting DOACs early in some groups, including Asian populations that have a higher bleeding risk (these trials included predominantly White patients) and people who are older or frail, who may have extensive small vessel disease.

During the discussion, several questions centered on the lack of imaging data available on the patients in the studies. Anderson said imaging data would help reassure clinicians on the safety of early anticoagulation in patients with large infarcts.

“Stroke clinicians make decisions on the basis of the patient and on the basis of the brain, and we only have the patient information at the moment. We don’t have information on the brain — that comes from imaging.”

Regardless, he believes these new data will lead to a shift in practice. “But maybe, it won’t be as dramatic as we would hope because I think some clinicians may still hesitate to apply these results to patients at high risk of bleeding. With imaging data from the studies that might change.”

The OPTIMAS trial was funded by University College London and the British Heart Foundation. Werring reported consulting fees from Novo Nordisk, National Institute for Health and Care Excellence, and Alnylam; payments or speaker honoraria from Novo Nordisk, Bayer, and AstraZeneca/Alexion; participation on a data safety monitoring board for the OXHARP trial; and participation as steering committee chair for the MACE-ICH and PLINTH trials. Åsberg received institutional research grants and lecture fees to her institution from AstraZeneca, Boehringer Ingelheim, Bristol Myers Squibb, and Institut Produits Synthése. Sandset and de Sousa were both steering committee members of the ELAN trial. Anderson reported grant funding from Penumbra and Takeda China.
 

A version of this article appeared on Medscape.com.


FROM WSC 2024


A New, Easily Identifiable Sign of Concussion?


Researchers have identified a potential new sign of concussion in athletes, particularly football players, that can easily be spotted on the field, indicating the need for immediate removal from the game and evaluation for potential traumatic brain injury (TBI).

Spontaneous Headshake After a Kinematic Event (SHAAKE) refers to the rapid, back-and-forth head movement athletes exhibit following a blow to the head. This voluntary motion typically occurs within seconds to minutes after impact and is a familiar response in athletes.

In a recent survey, 7 out of 10 adult athletes recalled making this movement after a collision, and in roughly 3 out of 4 instances they attributed it to a concussion. The association was strongest among football players, who reported that over 90% of SHAAKE episodes were associated with a concussion.

The results were published online in Diagnostics.
 

Call to Action

“Everyone” — including sports and medical organizations — “should be adding this to their list of potential concussion signs and their protocol immediately,” study investigator Chris Nowinski, PhD, CEO and co-founder of the Concussion Legacy Foundation, told this news organization.

Nowinski said it’s “fascinating” that this concussion sign hasn’t been formally studied or added to formal concussion screening metrics before now, given that it’s been depicted in movies, television, and cartoons for decades.

Coaches, medical professionals, and concussion spotters should be trained to recognize when a SHAAKE happens, he said.

“The interesting thing is, I don’t think coaches or parents need much training other than to officially tie this to suspicion of a concussion,” Nowinski added.
 

The Case of Miami Dolphins QB Tua Tagovailoa

Nowinski said he was tipped off to SHAAKE as a concussion sign after Miami Dolphins quarterback Tua Tagovailoa’s controversial undiagnosed concussion during a National Football League (NFL) game in 2022.

After Tagovailoa’s head hit the ground, he rapidly shook his head side to side, displaying a SHAAKE, before stumbling and collapsing. At the time, a sideline doctor attributed his collapse to a prior back injury.

If Tagovailoa had been diagnosed with a concussion, he likely would not have been playing in a game just 4 days later, where he lost consciousness after suffering a suspected second concussion and was removed from the field on a stretcher.

For the survey, Nowinski and colleagues showed 347 current and former athletes, including 109 football players, video examples of SHAAKE and then asked them about their experiences with this potential indicator of concussion.

Nearly 69% of athletes reported exhibiting a SHAAKE during their career, and 93% of those reported a SHAAKE in association with concussion at least once. Athletes reported SHAAKE a median of five times in their lives.

Of the athletes who reported SHAAKE, 85% linked this head-shaking movement to concussion symptoms such as disorientation (71%) and dizziness (54%).

Across all sports, SHAAKE showed a sensitivity of 49.6% and a positive predictive value (PPV) of 72.4% for diagnosing concussions.

Among football players, sensitivity improved to 52.3%, with an estimated specificity of 99.9%, a PPV of 91.9%, and an estimated negative predictive value of 99.5%.

The main limitation of the survey was the potential for recall bias due to survey participants self-reporting prior concussions. The researchers called for future prospective studies to validate SHAAKE as a sign of concussion.


Instant Replay for Brain Injury?

Experts echoed the need for validation. SHAAKE represents a “promising advance” in objective TBI assessment, particularly for sideline evaluation, said Shaheen Lakhan, MD, PhD, a neurologist and researcher based in Miami, Florida, who wasn’t involved in the research.

The potential value of SHAAKE is “particularly notable given the well-documented tendency for athletes to minimize or conceal symptoms to maintain play eligibility, a limitation that has historically challenged our reliance on subjective reporting and observational assessments,” Lakhan said.

“Moving forward, validation through prospective studies incorporating real-time video analysis, helmet sensor data, and clinician-confirmed TBI diagnoses will be essential. With appropriate validation, SHAAKE could emerge as a valuable component of our sideline assessment arsenal, complementing rather than replacing existing diagnostic approaches,” Lakhan said.

“SHAAKE could be the ‘instant replay’ for brain injuries that sports medicine has been waiting for — but like any new technology, we need to make sure it works for every player, not just some,” Lakhan added.

Also weighing in, Richard Figler, MD, director of the Concussion Center, Cleveland Clinic Sports Medicine Center, Cleveland, cautioned that the survey participants were recruited from a concussion registry and self-reported an average of 23 concussions — more than one third of which happened 5-10 years prior — which raises the question, “How much are they actually remembering?”

“Our goal is to make sure that the athletes are safe and that we’re not missing concussions, and we don’t have great tools to start off with. This study opens up the door for some prospective studies [of SHAAKE] moving forward. I think we need more data before this should be listed as a definitive marker,” said Figler, who also wasn’t involved in the study.

In any case, when it comes to suspected concussion in sports, “when in doubt, you sit them out,” Figler said.

This research received no external funding. Nowinski has received travel reimbursement from the NFL Players Association (NFLPA), NFL, World Rugby, WWE, and All Elite Wrestling; served as an expert witness in cases related to concussion and chronic traumatic encephalopathy; and is compensated for speaking appearances and serving on the NFL Concussion Settlement Player Advocacy Committee. Daniel H. Daneshvar served as an expert witness in legal cases involving brain injury and concussion and received funding from the Football Players Health Study at Harvard University, which is funded by the NFLPA and evaluates patients for the MGH Brain and Body TRUST Center, sponsored in part by the NFLPA. Lakhan and Figler had no relevant disclosures.
 

A version of this article appeared on Medscape.com.


FROM DIAGNOSTICS
