Why do women get Alzheimer’s disease more often than men? Study offers clue

Of the more than 6 million U.S. patients aged 65 or older with Alzheimer’s disease, nearly two-thirds are women. A new study published online in Cell may help explain the gender gap – and offer clues to new treatments for helping patients of both sexes fight back.

Researchers zeroed in on a gene named USP11, found on the X chromosome. People assigned female at birth have two X chromosomes, while people assigned male at birth have one X and one Y. So while all males have one copy of USP11, females have two.
 

The body’s trash collection system

In the normal course of events, the brain creates waste that must be removed lest it become toxic. One waste product is the protein tau. Too little tau can damage nerve cells, explained researchers David Kang, PhD, and Jung-A “Alexa” Woo, PhD, who led the study. But too much becomes toxic and can lead to neurodegenerative diseases such as Alzheimer’s disease. In fact, new research suggests that testing for changes in tau may someday help doctors diagnose Alzheimer’s disease earlier.

To manage tau, the brain uses a regulatory protein called ubiquitin to “tag” extra tau, signaling the body that it should be removed.

USP11’s job is to provide the instructions for making an enzyme that removes the ubiquitin tag, keeping tau levels in balance. But if too much of the enzyme is present, too much tau gets untagged – and not enough of it gets cleared.

“Our study showed USP11 is higher in females than males in both humans and in mice,” Dr. Kang said. “That’s already true before the onset of dementia. But once someone has Alzheimer’s disease, USP11 is much higher – regardless of sex.”

The study adds to a growing body of evidence that shows that women may be more vulnerable than men to higher levels of tau, possibly explaining why women are affected by the disease more often than men.

But what if there were a way to “turn off” or deactivate the USP11 gene? Might that help prevent Alzheimer’s disease? And could it be done safely?
 

What happened when the gene was eliminated?

To examine these questions, researchers used a method of gene manipulation to completely delete the USP11 gene in mice. They then examined the mice for changes. The result? The mice seemed fine.

“The mice bred well. Their brains looked fine,” Dr. Woo said.

It would not be possible – or ethical – to remove a gene from humans. But when a medical condition makes a certain gene unhelpful, that gene can be partially blocked, or its expression can be reduced, with medication. In fact, medications targeting enzymes are common; examples include statins for cardiovascular disease and HIV treatments that inhibit protease enzymes.

“If we are able to identify some type of medicine that would inhibit USP11, our study suggests it would be well tolerated and benefit women,” Dr. Woo said.

Dr. Kang also cautioned that the process for creating such a therapy takes at least 10-15 years. The researchers said they’d like to shorten that timeline and plan to study medications already approved by the FDA to see whether any might target USP11 gene activity – and bring a new treatment for Alzheimer’s disease to patients sooner.

A version of this article first appeared on WebMD.com.

Major life stressors ‘strongly predictive’ of long COVID symptoms

After recovery from acute infection with SARS-CoV-2, major stressful life events such as the death of a loved one or financial insecurity can have a significant impact on the development of long COVID symptoms, new research suggests.

Major life stressors in the year after hospital discharge for COVID-19 are “strongly predictive of a lot of the important outcomes that people may face after COVID,” lead investigator Jennifer A. Frontera, MD, a professor in the department of neurology at New York University Langone Health, said in an interview.

These outcomes include depression, brain fog, fatigue, trouble sleeping, and other long COVID symptoms.

The findings were published online in the Journal of the Neurological Sciences.
 

Major stressful events common

Dr. Frontera and the NYU Neurology COVID-19 study team evaluated 451 adults who survived a COVID hospital stay. Of these, 383 completed a 6-month follow-up, 242 completed a 12-month follow-up, and 174 completed follow-up at both time points. 

Within 1 year of discharge, 77 (17%) patients died and 51% suffered a major stressful life event.

In multivariable analyses, major life stressors – including financial insecurity, food insecurity, death of a close contact, and new disability – were strong independent predictors of disability, trouble with activities of daily living, depression, fatigue, sleep problems, and prolonged post-acute COVID symptoms. The adjusted odds ratios for these outcomes ranged from 2.5 to 20.8. 

The research also confirmed the contribution of traditional risk factors for long COVID symptoms, as shown in past studies. These include older age, poor pre-COVID functional status, and more severe initial COVID-19 infection.

Long-term sequelae of COVID are increasingly recognized as major public health issues. 

It has been estimated that roughly 16 million U.S. adults aged 18-65 years have long COVID, with the often debilitating symptoms keeping up to 4 million out of work.
 

Holistic approach

Dr. Frontera said it’s important to realize that “sleep, fatigue, anxiety, depression, even cognition are so interwoven with each other that anything that impacts any one of them could have repercussions on the other.”

She added that it “certainly makes sense that there is an interplay or even a bidirectional relationship between the stressors that people face and how well they can recover after COVID.”

Therapies that lessen the trauma of the most stress-inducing life events need to be a central part of treatment for long COVID, with more research needed to validate the best approaches, Dr. Frontera said.

She also noted that social services or case management resources may be able to help address at least some of the stressors that individuals are under – and it is important to refer them to these resources. Referral to mental health services is also important.

“I think it’s really important to take a holistic approach and try to deal with whatever the problem may be,” said Dr. Frontera.

“I’m a neurologist, but as part of my evaluation, I really need to address if there are life stressors or mental health issues that may be impacting this person’s function,” she added.

The study had no commercial funding. The investigators reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.

U.S. dementia rate drops as education, women’s employment rises

Dementia prevalence is dropping in the United States, new research shows. New data from the Health and Retirement Study, a nationally representative survey, show that the prevalence of dementia among individuals aged 65 and older dropped from 12.2% in 2000 to 8.5% in 2016 – a 30.1% decrease. In men, the prevalence of dementia fell from 10.2% to 7.0%, while for women, it declined from 13.6% to 9.7%, researchers reported. Their findings were published online in PNAS.

The study also revealed that the proportion of college-educated men in the sample increased from 21.5% in 2000 to 33.7% in 2016, while the proportion of college-educated women increased from 12.3% in 2000 to 23% in 2016.

The findings also show a decline in dementia prevalence among non-Hispanic Black men, from 17.2% to 9.9% – a decrease of 42.6%. Among non-Hispanic White men, prevalence declined from 9.3% to 6.6%, a decrease of 29.0%.

The investigators also found a substantial increase in the level of education between 2000 and 2016. In addition, they found that, among 74- to 84-year-old women in 2000, 29.5% had worked for more than 30 years during their lifetime versus 59.0% in 2016.

The investigators speculated that the decline in dementia prevalence reflects larger socioeconomic changes in the United States as well as prevention strategies to reduce cardiovascular disease.

A person born around 1920, for example, would have had greater exposure to the Great Depression, while someone born in 1936 would have benefited more from the changes in living standards in the years following World War II, they noted.

“There’s a need for more research on the effect of employment on cognitive reserve. It’s plausible that working is good for your mental cognitive abilities,” said study investigator Péter Hudomiet, PhD, from the RAND Corporation, adding that there may also be benefits that extend beyond working years. It’s possible that women’s greater participation in the workforce gives them more chances to establish relationships that in some cases last well into retirement and provide essential social connection. It’s well known that social isolation has a negative impact on cognition.

The investigators noted that it is beyond the scope of their study to draw definitive conclusions about the causes of the decline, but they observed that positive trends in employment and standard of living are plausible explanations. “They would suggest that as schooling levels continue to rise in the U.S. population in younger generations, the prevalence of dementia would continue to decrease.”

The investigators report no relevant financial relationships.

A version of this article first appeared on Medscape.com.

‘Key cause’ of type 2 diabetes identified

Understanding of the key mechanisms underlying the progression of type 2 diabetes (T2D) has been advanced by new research from the University of Oxford, England, suggesting potential ways to “slow the seemingly inexorable decline in beta-cell function in T2D.”

The study in mice elucidated a “key cause” of T2D by showing that high blood glucose reprograms the metabolism of pancreatic beta-cells, helping to explain the progressive decline in their function in diabetes.

Scientists already knew that chronic hyperglycemia leads to a progressive decline in beta-cell function and, conversely, that the failure of pancreatic beta-cells to produce insulin results in chronically elevated blood glucose. However, the exact cause of beta-cell failure in T2D has remained unclear. T2D typically presents in later adult life, and by the time of diagnosis as much as 50% of beta-cell function has been lost.

In the United Kingdom there are nearly 5 million people diagnosed with T2D, which costs the National Health Service some £10 billion annually.
 

Glucose metabolites, rather than glucose itself, drive failure of cells to release insulin

The new study, published in Nature Communications, used both an animal model of diabetes and in vitro culture of beta-cells in a high-glucose medium. In both cases the researchers showed, for the first time, that it is glucose metabolites, rather than glucose itself, that drive the failure of beta-cells to release insulin and are key to the progression of type 2 diabetes.

Senior researcher Frances Ashcroft, PhD, of the department of physiology, anatomy and genetics at the University of Oxford, said: “This suggests a potential way in which the decline in beta-cell function in T2D might be slowed or prevented.”

Blood glucose concentration is controlled within narrow limits, the team explained. When it is too low for more than a few minutes, consciousness is rapidly lost because the brain is starved of fuel. However, chronic elevation of blood glucose leads to the serious complications found in poorly controlled diabetes, such as retinopathy, nephropathy, peripheral neuropathy, and cardiac disease. Insulin, released from pancreatic beta-cells when blood glucose levels rise, is the only hormone that can lower the blood glucose concentration, and insufficient secretion results in diabetes. In T2D, the beta-cells are still present (unlike in T1D), but they have a reduced insulin content and the coupling between glucose and insulin release is impaired.
 

Vicious spiral of hyperglycemia and beta-cell damage

Previous work by the same team had shown that chronic hyperglycemia damages the ability of the beta-cell to produce insulin and to release it when blood glucose levels rise. This suggested that “prolonged hyperglycemia sets off a vicious spiral in which an increase in blood glucose leads to beta-cell damage and less insulin secretion - which causes an even greater increase in blood glucose and a further decline in beta-cell function,” the team explained. 

Lead researcher Elizabeth Haythorne, PhD, said: “We realized that we next needed to understand how glucose damages beta-cell function, so we can think about how we might stop it and so slow the seemingly inexorable decline in beta-cell function in T2D.”

In the new study, they showed that altered glycolysis in T2D occurs, in part, through marked up-regulation of mammalian target of rapamycin complex 1 (mTORC1), a protein complex involved in control of cell growth, dysregulation of which underlies a variety of human diseases, including diabetes. Up-regulation of mTORC1 led to changes in metabolic gene expression, oxidative phosphorylation and insulin secretion. Furthermore, they demonstrated that reducing the rate at which glucose is metabolized and at which its metabolites build up could prevent the effects of chronic hyperglycemia and the ensuing beta-cell failure. 

“High blood glucose levels cause an increased rate of glucose metabolism in the beta-cell, which leads to a metabolic bottleneck and the pooling of upstream metabolites,” the team said. “These metabolites switch off the insulin gene, so less insulin is made, as well as switching off numerous genes involved in metabolism and stimulus-secretion coupling. Consequently, the beta-cells become glucose blind and no longer respond to changes in blood glucose with insulin secretion.”

Blocking metabolic enzyme could maintain insulin secretion

The team attempted to block the first step in glucose metabolism, and therefore prevent the gene changes from taking place, by blocking the enzyme glucokinase, which regulates the process. They found that this could maintain glucose-stimulated insulin secretion even in the presence of chronic hyperglycemia.

“Our results support the idea that progressive impairment of beta-cell metabolism, induced by increasing hyperglycemia, speeds T2D development, and suggest that reducing glycolysis at the level of glucokinase may slow this progression,” they said.

Dr. Ashcroft said: “This is potentially a useful way to try to prevent beta-cell decline in diabetes. Because glucose metabolism normally stimulates insulin secretion, it was previously hypothesized that increasing glucose metabolism would enhance insulin secretion in T2D, and glucokinase activators were trialled, with varying results. 

“Our data suggests that glucokinase activators could have an adverse effect and, somewhat counter-intuitively, that a glucokinase inhibitor might be a better strategy to treat T2D. Of course, it would be important to reduce glucose flux in T2D to that found in people without diabetes – and no further. But there is a very long way to go before we can tell if this approach would be useful for treating beta-cell decline in T2D. 

“In the meantime, the key message from our study if you have type 2 diabetes is that it is important to keep your blood glucose well controlled.”

This study was funded by the UK Medical Research Council, the Biotechnology and Biological Sciences Research Council, the John Fell Fund, and the Nuffield Benefaction for Medicine/Wellcome Institutional Strategic Support Fund. The authors declared no competing interests.

A version of this article first appeared on Medscape UK.

Nutrition for cognition: A missed opportunity in U.S. seniors?

Among older adults who use the U.S. Supplemental Nutrition Assistance Program (SNAP), rates of memory decline appear to be slower than among those who don’t use the program, new research shows. Researchers assessed the memory function of more than 3,500 persons – both SNAP users and nonusers – over a period of 20 years. They found that those who didn’t use the food benefits program experienced 2 more years of cognitive aging compared with program users.

Of the 3,555 individuals included in the study, all were eligible to use the benefits, but only 559 did, leaving 2,996 participants who did not take advantage of the program.

Low program participation levels translate into a missed opportunity to prevent dementia, said study investigator Adina Zeki Al Hazzouri, PhD, assistant professor of epidemiology at the Columbia Aging Center at Columbia University Mailman School of Public Health in New York.

She said that prior research has shown that stigma may prevent older Americans from using SNAP. “Educational programs are needed to reduce the stigma that the public holds towards SNAP use,” she said.

Policy changes could increase usage among older individuals, Dr. Zeki Al Hazzouri noted. Such changes could include simplifying enrollment and reporting procedures, shortening recertification periods, and increasing benefit levels.

The study was published online in Neurology.
 

Memory preservation

Dr. Zeki Al Hazzouri and her team assessed respondents from the Health and Retirement Study (HRS), a representative sample of Americans aged 50 and older. All respondents who were eligible to participate in SNAP in 1996 were followed every 2 years until 2016.

At each assessment, HRS respondents completed memory tests, including immediate and delayed word recall. For those who were too impaired to complete the interview, proxy informants – typically, their spouses or family members – assessed the memory and cognition of their family members using validated instruments, such as the 16-item Informant Questionnaire on Cognitive Decline in the Elderly.

Investigators used a validated memory function composite score, which is benchmarked against the memory assessments and evaluations of the Aging, Demographics, and Memory Study (ADAMS) cohort.

The team found that compared with nonusers, SNAP users were more likely to be women, Black, and born in the southern United States. They were less likely to be married and had more chronic conditions, such as high blood pressure, diabetes, cancer, heart problems, psychiatric problems, and arthritis.

One important study limitation was that SNAP use was measured only once during the study, the investigators noted. Ideally, Dr. Zeki Al Hazzouri said, future research would examine cumulative SNAP use history and explore the pathways that might account for the association between SNAP use and memory decline.

Although there were no significant differences in baseline memory function between SNAP users and nonusers, users experienced approximately 2 fewer years of cognitive aging over a 10-year period than those who didn’t use the program.

Dr. Zeki Al Hazzouri speculated that SNAP benefits may slow cognitive aging by contributing to overall brain health and that, in comparison with nonusers, SNAP users absorb more nutrients, which promote neuronal integrity.

The investigators theorized that SNAP benefits may reduce stress from financial hardship, which has been linked to premature cognitive aging in other research.

“SNAP may also increase the purchasing power and investment in other health preserving behaviors, but also resulting in better access to care, which may in turn result in better disease management and management of risk factors for cognitive function,” the investigators wrote.

An underutilized program

In an accompanying editorial, Steven Albert, PhD, Philip B. Hallen Endowed Chair in Community Health and Social Justice at the University of Pittsburgh, noted that in 2020, among households with people aged 50 and older in the United States, more than 9 million Americans experienced food insecurity.

Furthermore, he pointed out, research from 2018 showed that 71% of people aged 60 and older who met income eligibility for SNAP did not participate in the program. “SNAP is an underutilized food security program involving substantial income supplements for older people with low incomes.

“Against the backdrop of so many failures of pharmacotherapy for dementia and the so far inexorable increase in the prevalence of dementia due to population aging, are we missing an opportunity to support cognitive health by failing to enroll the 14 million Americans who are over age 60 and eligible for SNAP but who do not participate?” Dr. Albert asked. He suggested that it would be helpful to determine this through a randomized promotion trial.

The study was funded by the National Institute on Aging. The authors reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Traffic-related pollutant tied to increased dementia risk

Exposure to a traffic-related air pollutant significantly increases risk for dementia, new research suggests. Results from a meta-analysis, which included a total of more than 90 million people, showed that risk for dementia increased 3% for every 1 mcg/m3 rise in fine particulate matter (PM2.5) exposure.

Particulate matter is a mixture of solid particles and liquid droplets produced by the burning of fossil fuels and by nitrogen oxide emissions, including those from road traffic exhaust.

While the research only showed an association between this type of air pollution and dementia risk, the estimates were consistent across the different analyses used.

“It’s rather sobering that there is this 3% relationship between incidence of dementia and the particulate matter and that it is such a precise estimate,” senior investigator Janet Martin, PharmD, MSc, associate professor of anesthesia & perioperative medicine and epidemiology & biostatistics at Western University, London, Ont., told this news organization.

The findings were published online in Neurology.
 

Conflicting results in past studies

Air pollution is a known risk factor for dementia, but studies attempting to pinpoint its exact impact have yielded conflicting results.

Researchers analyzed data from 17 studies with a total of 91.4 million individuals, 6% of whom had dementia. In addition to PM2.5, the investigators also assessed exposure to nitrogen oxides, which form smog; nitrogen dioxide; and ozone.

After adjustments for other known risk factors, such as age and gender, results showed that dementia risk increased by 3% for every 1 mcg/m3 rise in PM2.5 exposure (adjusted hazard ratio, 1.03; 95% confidence interval, 1.02-1.05).
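To put that per-unit estimate in perspective, hazard ratios for a continuous exposure are conventionally assumed to compound multiplicatively, so a larger exposure contrast can be approximated by exponentiation. As a rough illustration (an extrapolation under that assumption, not a figure reported by the study):

\[
\mathrm{HR}_{10\ \mathrm{mcg/m^3}} \approx \left(\mathrm{HR}_{1\ \mathrm{mcg/m^3}}\right)^{10} = 1.03^{10} \approx 1.34
\]

That is, a 10 mcg/m3 difference in long-term PM2.5 exposure would correspond to roughly a 34% higher hazard of dementia under that assumption.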

The associations between dementia and exposure to nitrogen oxides (HR, 1.05; 95% CI, 0.99-1.13), nitrogen dioxide (HR, 1.03; 95% CI, 1.00-1.07) and ozone (HR, 1.01; 95% CI, 0.91-1.11) did not reach statistical significance. However, the confidence intervals were wide enough that clinical relevance cannot be ruled out, Dr. Martin said.

The study did not examine how or if the duration of PM2.5 exposure affected dementia risk. In addition, the investigators were not able to identify a threshold above which dementia risk begins to rise.

The Environmental Protection Agency considers average yearly exposures of up to 12 mcg/m3 to be safe. The World Health Organization sets that limit lower, at 5 mcg/m3.

Dr. Martin noted that more studies are needed to explore those issues, as well as the mechanisms by which air pollutants contribute to the pathology of dementia. However, the clear link between fine particulate matter exposure and increased risk emphasizes the need to address air pollution as a modifiable risk factor for dementia.

“The rising tide of dementia is not something we can easily reverse,” Dr. Martin said. “The evidence has been so elusive for how to treat dementia once you have it, so our biggest opportunity is to prevent it.”

Results from a study published earlier in 2022 estimated that rates of dementia will triple worldwide and double in the United States by 2050 unless steps are taken to mitigate risk factors.

Research also suggests that improving air quality by reducing PM2.5 by just 10% results in a 14% decreased risk for dementia.
 

‘Impressive’ pattern

Paul Rosenberg, MD, codirector of the Memory and Alzheimer’s Treatment Center in the division of geriatric psychiatry at Johns Hopkins University, Baltimore, said that air pollution “is the most prominent environmental risk we’ve found” for dementia. It also “adds to many other lifestyle and comorbidity risks, such as lack of exercise, obesity, depression, hearing loss, etc,” said Dr. Rosenberg, who was not involved with the research.

He noted that what was “most impressive” was that, in most of the pooled studies, small particulate air pollution was associated with dementia. “The overall pattern is most impressive and the effect sizes quite consistent over most of the studies,” Dr. Rosenberg said.

The meta-analysis was unfunded. Dr. Martin and Dr. Rosenberg reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Dementia prevalence study reveals inequities


Dementia and mild cognitive impairment (MCI) disproportionately affect Black and Hispanic individuals, as well as people with less education, based on new U.S. data from The Health and Retirement Study (HRS).

These inequities likely stem from structural racism and income inequality, necessitating a multifaceted response at an institutional level, according to lead author Jennifer J. Manly, PhD, a professor of neuropsychology in neurology at the Gertrude H. Sergievsky Center and the Taub Institute for Research in Aging and Alzheimer’s Disease at Columbia University, New York.
 

A more representative dataset

Between 2001 and 2003, a subset of HRS participants underwent extensive neuropsychological assessment in the Aging, Demographics, and Memory Study (ADAMS), providing data which have since been cited by hundreds of published studies, the investigators wrote in JAMA Neurology. Those data, however, failed to accurately represent the U.S. population at the time, and have not been updated since.

Dr. Jennifer J. Manly

“The ADAMS substudy was small, and the limited inclusion of Black, Hispanic, and American Indian or Alaska Native participants contributed to lack of precision of estimates among minoritized racial and ethnic groups that have been shown to experience a higher burden of cognitive impairment and dementia,” Dr. Manly and colleagues wrote.

The present analysis used a more representative dataset from HRS participants who were 65 years or older in 2016. From June 2016 to October 2017, 3,496 of these individuals underwent a comprehensive neuropsychological test battery and an informant interview, with dementia and MCI classified based on standard diagnostic criteria.

In total, 393 people were classified as having dementia (10%), while 804 had MCI (22%), both figures approximating estimates reported by previous studies, according to the investigators. In further alignment with past research, age was a clear risk factor: each 5-year increment in age was associated with a 17% increase in the risk of MCI and a 95% increase in the risk of dementia.
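If those per-5-year estimates are assumed to compound multiplicatively across age (a simplifying assumption for illustration; the study reports only the per-5-year figures), a 10-year age difference would correspond roughly to

\[
1.17^{2} \approx 1.37 \ \text{(MCI)} \qquad \text{and} \qquad 1.95^{2} \approx 3.80 \ \text{(dementia)},
\]

that is, about 1.4 times the odds of MCI and nearly 4 times the odds of dementia per additional decade of age.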

Compared with college-educated participants, individuals who did not graduate from high school had a 60% increased risk for both dementia (odds ratio, 1.6; 95% confidence interval, 1.1-2.3) and MCI (OR, 1.6; 95% CI, 1.2-2.2). Other educational strata were not associated with significant differences in risk.

Compared with White participants, Black individuals had an 80% increased risk of dementia (OR, 1.8; 95% CI, 1.2-2.7), but no increased risk of MCI. Conversely, non-White Hispanic individuals had a 40% increased risk of MCI (OR, 1.4; 95% CI, 1.0-2.0), but no increased risk of dementia, compared with White participants.

“Older adults racialized as Black and Hispanic are more likely to develop cognitive impairment and dementia because of historical and current structural racism and income inequality that restrict access to brain-health benefits and increase exposure to harm,” Dr. Manly said in a written comment.

These inequities deserve a comprehensive response, she added.

“Actions and policies that decrease discriminatory and aggressive policing policies, invest in schools that serve children that are racialized as Black and Hispanic, repair housing and economic inequalities, and provide equitable access to mental and physical health, can help to narrow disparities in later life cognitive impairment,” Dr. Manly said. “Two other areas of focus for policy makers are the shortage in the workforce of dementia care specialists, and paid family leave for caregiving.”
 

 

 

Acknowledging the needs of the historically underrepresented

Lealani Mae Acosta, MD, MPH, associate professor of neurology at Vanderbilt University Medical Center, Nashville, Tenn., applauded the investigators for their “conscious effort to expand representation of historically underrepresented minorities.”

Dr. Lealani Mae Acosta

The findings themselves support what has been previously reported, Dr. Acosta said in an interview, including the disproportionate burden of cognitive disorders among people of color and those with less education.

Clinicians need to recognize that certain patient groups face increased risks of cognitive disorders, and should be screened accordingly, Dr. Acosta said, noting that all aging patients should undergo such screening. The push for screening should also occur on a community level, along with efforts to build trust between at-risk populations and health care providers.

While Dr. Acosta reiterated the importance of these new data from Black and Hispanic individuals, she noted that gaps in representation remain, and methods of characterizing populations deserve refinement.

“I’m a little bit biased because I’m an Asian physician,” Dr. Acosta said. “As much as I’m glad that they’re highlighting these different disparities, there weren’t enough [participants in] specific subgroups like American Indian or Alaska Native, Asian, Native Hawaiian or Pacific Islander, to be able to identify specific trends within [those groups] that are, again, historically underrepresented patient populations.”

Grouping all people of Asian descent may also be an oversimplification, she added, as differences may exist between individuals originating from different countries.

“We always have to be careful about lumping certain groups together in analyses,” Dr. Acosta said. “That’s just another reminder to us – as clinicians, as researchers – that we need to do better by our patients by expanding research opportunities, and really studying these historically underrepresented populations.”

The study was supported by the National Institute on Aging. The investigators disclosed additional relationships with the Alzheimer’s Association and the National Institutes of Health. Dr. Acosta reported no relevant competing interests.


Viagra, Cialis, and Alzheimer’s risk: New data


Drugs commonly used to treat erectile dysfunction (ED) are not associated with a decreased risk of Alzheimer’s disease and related dementias (ADRD), new research shows.

The findings contradict results from a previous study that suggested that individuals who take sildenafil (Viagra) were significantly less likely to develop Alzheimer’s.

The new research, part of a larger effort to identify existing medications that could be repurposed to treat ADRD, employed a study design that reduced the risk for potential bias that may have influenced the earlier findings, the investigators note.

“That study came out last fall and was widely covered in the media, and we thought there were some methodological shortcomings that might have explained the results,” lead investigator Rishi Desai, PhD, assistant professor of medicine at Harvard Medical School and an associate epidemiologist at Brigham and Women’s Hospital, both in Boston, said in an interview.

The new study was published online in Brain Communications.


 

Not the final word?

Animal studies suggest that phosphodiesterase-5 (PDE5) inhibitors, a drug class that includes the ED drugs sildenafil and tadalafil (Cialis), improve memory and cognitive function and reduce amyloid burden. But studies in humans have yielded conflicting results.*

Although the new research and the work published last year both drew on Medicare data, they examined different patient populations.

The first study compared those who took sildenafil for any reason with those who did not take it. That design likely resulted in a comparison of individuals with ED – the most common indication for sildenafil – with generally older individuals with diabetes or hypertension, Dr. Desai said.

In contrast, the current study included only those with pulmonary arterial hypertension (PAH), which is also an indication for PDE5 inhibitors. The researchers compared ADRD incidence in those who took PDE5 inhibitors with the incidence among those who took a different medication to treat their PAH. They used propensity matching to create two groups with similar characteristics and examined the data using four analytic strategies.
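For readers less familiar with the method, the following is a minimal, generic sketch of 1:1 nearest-neighbor propensity-score matching in Python. It is not the authors’ actual pipeline; the simulated cohort, covariates, and column names are invented for illustration only.

# Minimal sketch of 1:1 nearest-neighbor propensity-score matching (illustrative only).
# This is not the study's pipeline; the simulated cohort and covariates are hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "age": rng.normal(70, 8, n),                     # hypothetical covariates
    "male": rng.integers(0, 2, n),
    "comorbidities": rng.poisson(2, n),
    "treated": (rng.random(n) < 0.4).astype(int),    # 1 = PDE5 inhibitor user (simulated)
})

# Step 1: model the probability of treatment given covariates (the propensity score).
X = df[["age", "male", "comorbidities"]]
ps_model = LogisticRegression(max_iter=1000).fit(X, df["treated"])
df["ps"] = ps_model.predict_proba(X)[:, 1]

# Step 2: greedily pair each treated person with the untreated person whose
# propensity score is closest, without replacement.
controls = df[df["treated"] == 0].copy()
pairs = []
for idx, row in df[df["treated"] == 1].iterrows():
    if controls.empty:
        break
    j = (controls["ps"] - row["ps"]).abs().idxmin()
    pairs.append((idx, j))
    controls = controls.drop(j)

matched = df.loc[[i for pair in pairs for i in pair]]
# Step 3: check covariate balance in the matched sample before comparing outcomes.
print(matched.groupby("treated")[["age", "comorbidities"]].mean())

In practice, analyses of this kind typically also apply a caliper on the propensity-score distance and then compare outcomes (here, ADRD incidence) within the matched cohort.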

The investigators found no significant difference between groups in the incidence of ADRD, regardless of the strategy they used. Cell culture studies also revealed no protective effect from PDE5 inhibitors.

“No study of this kind should claim the final word,” Dr. Desai said. “It is extremely difficult to nail down causality from these types of data sources.”
 

Impressive study design

Commenting on the findings, David Knopman, MD, professor of neurology at Mayo Clinic, Rochester, Minn., described the study design as “impressive” for its efforts to minimize bias, a key limitation in the previous study.

“It was always the case that the claims about sildenafil needed further developmental work prior to testing the drug in randomized controlled trials,” Dr. Knopman said. “The evidence for the use of the drug was never sufficient for clinicians to use it in their patients.”

The study was funded by the National Institute on Aging. Dr. Desai is an investigator who receives research grants from Bayer, Vertex, and Novartis that were given to the Brigham and Women’s Hospital for unrelated projects. Dr. Knopman has disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Correction, 11/3/22: An earlier version of this article misstated the abbreviation for phosphodiesterase-5. It is PDE-5.


Finerenone: ‘Striking’ cut in pneumonia, COVID-19 risks


The nonsteroidal mineralocorticoid receptor antagonist finerenone (Kerendia) unexpectedly showed that it might protect against incident infective pneumonia and COVID-19. The finding was based on secondary analyses run on more than 13,000 people enrolled in the two pivotal trials for finerenone.

Finerenone was approved by the Food and Drug Administration in 2021 for slowing progressive renal dysfunction and preventing cardiovascular events in adults with type 2 diabetes and chronic kidney disease (CKD).
 

‘Striking reduction in the risk of pneumonia’

The “striking reduction in risk of pneumonia” in a new analysis suggests that “the propagation of pulmonary infection into lobar or bronchial consolidation may be reduced by finerenone,” write Bertram Pitt, MD, and coauthors in a report published on October 26 in JAMA Network Open.

They also suggest that if further studies confirm that finerenone treatment reduces complications from pneumonia and COVID-19, it would have “significant medical implications,” especially because of the limited treatment options now available for complications from COVID-19.

The new analyses used the FIDELITY dataset, a prespecified merging of results from the FIDELIO-DKD and FIGARO-DKD trials, which together enrolled 13,026 people with type 2 diabetes and CKD, as determined on the basis of the patients’ having a urine albumin-to-creatinine ratio of at least 30 mg/g.

The primary outcomes of these trials showed that treatment with finerenone led to significant slowing of the progression of CKD and a significant reduction in the incidence of cardiovascular events, compared with placebo during median follow-up of 3 years.

The new, secondary analyses focused on the 6.0% of participants in whom there was evidence of pneumonia and the 1.6% in whom there was evidence of COVID-19. Pneumonia was the most common serious adverse event in the two trials, a finding consistent with the documented risk for pneumonia faced by people with CKD.
 

Finerenone linked with a 29% relative reduction in pneumonia

When analyzed by treatment, the incidence of pneumonia was 4.7% among those who received finerenone and 6.7% among those who received placebo. This translated into a significant relative risk reduction of 29% associated with finerenone treatment.

Analysis of COVID-19 adverse events showed a 1.3% incidence among those who received finerenone and a 1.8% incidence among those in the placebo group, which translated into a significant 27% relative risk reduction linked with finerenone treatment.
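As a rough check on the arithmetic, the crude incidences translate into relative risk reductions close to the published figures (which come from the trials’ time-to-event models rather than from these raw proportions):

\[
\frac{6.7\% - 4.7\%}{6.7\%} \approx 0.30 \ \text{(pneumonia)} \qquad \frac{1.8\% - 1.3\%}{1.8\%} \approx 0.28 \ \text{(COVID-19)}
\]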

In contrast, the data showed no reduced incidence of several other respiratory infections among the finerenone recipients, including nasopharyngitis, bronchitis, and influenza. The data also showed no signal that pneumonia or COVID-19 was more severe among the people who did not receive finerenone, nor did finerenone treatment appear to affect pneumonia recovery.
 

Analysis based on adverse events reports

These secondary analyses are far from definitive. The authors relied on pneumonia and COVID-19 being reported as adverse events. Each investigator diagnosed pneumonia at their discretion, and the trials did not specify diagnostic criteria. The authors also acknowledge that testing for COVID-19 was “not widespread” and that one of the two pivotal trials largely ran prior to the onset of the COVID-19 pandemic, so only 6 of that trial’s more than 5,700 enrolled participants developed COVID-19 symptoms.

 

 

The authors hypothesize that several actions of finerenone might help mediate an effect on pneumonia and COVID-19: improvements in pulmonary inflammation and fibrosis, upregulation of expression of angiotensin-converting enzyme 2, and amelioration of right heart pressure and pulmonary congestion. In addition, antagonizing the mineralocorticoid receptor on monocytes and macrophages may block macrophage infiltration and accumulation of active macrophages, which can mediate the pulmonary tissue damage caused by COVID-19.

The FIDELIO-DKD and FIGARO-DKD trials and the FIDELITY combined database were sponsored by Bayer, the company that markets finerenone (Kerendia). Dr. Pitt has received personal fees from Bayer and personal fees and stock options from numerous other companies. Several coauthors reported having a financial relationship with Bayer, as well as with other companies.

A version of this article first appeared on Medscape.com.


A special part of the brain lights up when we see food


“We eat first with our eyes.” 

The Roman foodie Apicius is thought to have uttered those words in the 1st century A.D. Now, some 2,000 years later, scientists may be proving him right. 

Massachusetts Institute of Technology researchers have discovered a previously unknown part of the brain that lights up when we see food. Dubbed the “ventral food component,” this part resides in the brain’s visual cortex, in a region known to play a role in identifying faces, scenes, and words. 

The study, published in the journal Current Biology, used artificial intelligence (AI) to build a computer model of this part of the brain. Similar models are emerging across fields of research to simulate and study complex systems of the body; a computer model of the digestive system, for example, was recently used to determine the best body position for taking a pill.

“The research is still cutting-edge,” says study author Meenakshi Khosla, PhD. “There’s a lot more to be done to understand whether this region is the same or different in different individuals, and how it is modulated by experience or familiarity with different kinds of foods.”

Pinpointing those differences could provide insights into how people choose what they eat, or even help us learn what drives eating disorders, Dr. Khosla says. 

Part of what makes this study unique was the researchers’ approach, dubbed “hypothesis neutral.” Instead of setting out to prove or disprove a firm hypothesis, they simply started exploring the data to see what they could find. The goal: To go beyond “the idiosyncratic hypotheses scientists have already thought to test,” the paper says. So, they began sifting through a public database called the Natural Scenes Dataset, an inventory of brain scans from eight volunteers viewing 56,720 images. 
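
For readers curious what a “hypothesis-neutral” search can look like in practice, the sketch below decomposes a voxel-by-image response matrix into a handful of components and lists the images that drive each one most strongly. It is only an illustration of the general idea, not the study’s actual pipeline: the choice of non-negative matrix factorization, the random stand-in data, the component count, and every variable name are assumptions made for this example.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)

# Hypothetical stand-in for fMRI data: non-negative responses of 5,000
# visual-cortex voxels to 1,000 viewed images (a real analysis would load
# measured responses from a dataset such as the Natural Scenes Dataset).
responses = rng.gamma(shape=2.0, scale=1.0, size=(5000, 1000))

# Factorize the matrix into a small number of components without telling
# the algorithm what any component should represent.
model = NMF(n_components=10, init="nndsvda", max_iter=500, random_state=0)
voxel_weights = model.fit_transform(responses)  # voxels x components
image_profiles = model.components_              # components x images

# Inspect each component by the images that activate it most strongly;
# a component whose top images were overwhelmingly pictures of food would
# be the kind of surprise described in the article.
for k, profile in enumerate(image_profiles):
    top = np.argsort(profile)[::-1][:5]
    print(f"component {k}: most activating image indices {top.tolist()}")
```

The interesting step is the last one: because nothing in the decomposition is told to look for food, any food-selective signal has to emerge from the data on its own.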

As expected, the software analyzing the dataset spotted brain regions already known to be triggered by images of faces, bodies, words, and scenes. But to the researchers’ surprise, the analysis also revealed a previously unknown part of the brain that seemed to be responding to images of food. 

“Our first reaction was, ‘That’s cute and all, but it can’t possibly be true,’ ” Dr. Khosla says. 

To confirm their discovery, the researchers used the data to train a computer model of this part of the brain, a process that takes less than an hour. Then they fed the model more than 1.2 million new images. 
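
That passage compresses two steps: fitting an image-computable model of the region’s response, then scoring a very large set of new images with it. Below is a minimal sketch of that workflow under stated assumptions; the ridge-regression mapping, the stand-in image features, and all variable names are illustrative choices, not the researchers’ actual model.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n_train, n_features = 5_000, 512

# Stand-ins: in practice these would be feature embeddings of the images the
# volunteers viewed and the measured response of the candidate region to each.
train_features = rng.normal(size=(n_train, n_features)).astype(np.float32)
region_response = rng.normal(size=n_train).astype(np.float32)

# Fit a regularized linear mapping from image features to the region's response.
encoder = Ridge(alpha=10.0)
encoder.fit(train_features, region_response)

# "Feed" the model new images by scoring their features; with more than a
# million images this would run in batches, and the highest-scoring images
# would then be inspected by eye.
new_features = rng.normal(size=(20_000, n_features)).astype(np.float32)
predicted = encoder.predict(new_features)
print("images predicted to drive the region hardest:",
      np.argsort(predicted)[::-1][:10].tolist())
```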

Sure enough, the model lit up in response to food. Color didn’t matter – even black-and-white food images triggered it, though not as strongly as color ones. And the model could tell the difference between food and objects that looked like food: a banana versus a crescent moon, or a blueberry muffin versus a puppy with a muffin-like face. 

From the human data, the researchers found that some people responded slightly more to processed foods like pizza than unprocessed foods like apples. They hope to explore how other things, such as liking or disliking a food, may affect a person’s response to that food. 

This technology could open up other areas of research as well. Dr. Khosla hopes to use it to explore how the brain responds to social cues like body language and facial expressions. 

For now, Dr. Khosla has already begun to verify the computer model in real people by scanning the brains of a new set of volunteers. “We collected pilot data in a few subjects recently and were able to localize this component,” she says. 

A version of this article first appeared on Medscape.com.
