Customized Video Games Promising for ADHD, Depression in Children

Targeted video games could help reduce symptoms of attention-deficit/hyperactivity disorder (ADHD) and depression in children and adolescents, results of a new review and meta-analysis suggested.

Although the video game–based or “gamified” digital mental health interventions (DMHIs) were associated with modest improvements in ADHD symptoms and depression, investigators found no significant benefit in the treatment of anxiety.

“The studies are showing these video games really do work, at least for ADHD and depression but maybe not for anxiety,” said Barry Bryant, MD, Department of Psychiatry and Behavioral Sciences, Johns Hopkins University School of Medicine, Baltimore.

“The results may assist clinicians as they make recommendations to patients and parents regarding the efficacy of using these video games to treat mental health conditions.”

The findings were presented at the American Psychiatric Association (APA) 2024 Annual Meeting.
 

A Major Problem

Childhood mental illness is a “big problem,” with about 20% of children facing some mental health challenge such as ADHD, anxiety, or depression, said Dr. Bryant. Unfortunately, these youngsters typically have to wait a while to see a provider, he added.

DMHIs may be an option to consider in the meantime to help meet the increasing demand for treatment, he said.

Gamified DMHIs are like other video games, in that players advance in levels on digital platforms and are rewarded for progress. But they’re created specifically to target certain mental health conditions.

An ADHD game, for example, might involve users completing activities that require an increasing degree of attention. Games focused on depression might incorporate mindfulness and meditation practices or cognitive behavioral elements.

Experts in child psychiatry are involved in developing such games along with professionals in business and video game technology, said Dr. Bryant.

But the question is: Do these games really work?
 

Effective for ADHD, Depression

Investigators reviewed nearly 30 randomized controlled trials of gamified DMHIs as a treatment for anxiety, depression, and/or ADHD in people younger than 18 years that were published from January 1, 1990, to April 7, 2023.

The trials tested a wide variety of gamified DMHIs that fit the inclusion criteria: a control condition, a digital game intervention, sufficient data to calculate effect size, and availability in English.

A meta-analysis was performed to examine the therapeutic effects of the gamified DMHIs for ADHD, depression, and anxiety. For all studies, the active treatment was compared with the control condition using Hedges’ g to measure effect size and 95% CIs.
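As context for these numbers, Hedges' g is a bias-corrected standardized mean difference between the treatment and control groups. A minimal sketch of the computation follows; the symptom-score data here are hypothetical and purely illustrative, not taken from the study.

```python
import math

def hedges_g(x, y):
    """Bias-corrected standardized mean difference (Hedges' g)."""
    n1, n2 = len(x), len(y)
    m1, m2 = sum(x) / n1, sum(y) / n2
    # Sample variances (n - 1 denominator)
    v1 = sum((a - m1) ** 2 for a in x) / (n1 - 1)
    v2 = sum((b - m2) ** 2 for b in y) / (n2 - 1)
    s_pooled = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled           # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)    # small-sample correction factor
    return j * d

# Hypothetical symptom-score reductions (treatment vs control)
treated = [12, 9, 14, 11, 10, 13]
control = [8, 7, 10, 9, 6, 11]
print(round(hedges_g(treated, control), 3))
```

By convention, g values near 0.2 are read as small effects and values near 0.5 as moderate, which is why pooled estimates of about 0.28 are described as modest.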

Dr. Bryant noted there was significant heterogeneity of therapeutic effects between the studies and their corresponding gamified interventions.

The study found gamified DMHIs had a modest therapeutic effect for treating ADHD (pooled g = 0.280; P = .005) and depression (pooled g = 0.279; P = .005) in children and adolescents.

But games targeting anxiety didn’t seem to have the same positive impact (pooled g = 0.074; P = .197).

The results suggest the games “show potential and promise” for certain mental health conditions and could offer a “bridge” to accessing more traditional therapies, Dr. Bryant said.

“Maybe this is something that can help these children until they can get to see a psychiatrist, or it could be part of a comprehensive treatment plan,” he said.

The goal is to “make something that kids want to play and engage with” especially if they’re reluctant to sit in a therapist’s office.

The results provide clinicians with information they can actually use in their practices, said Dr. Bryant, adding that his team hopes to get their study published.
 

 

 

Gaining Traction

Commenting on the research, James Sherer, MD, medical director of Addiction Psychiatry at Overlook Medical Center, Atlantic Health System, said the study shows that the literature supports these video games, which “are gaining traction” in the field.

He noted the app for one such game, EndeavorRx, was one of the first to be approved by the US Food and Drug Administration (FDA) to treat ADHD in young people aged 8-17 years.

EndeavorRx challenges players to chase mystic creatures, race through different worlds, and use “boosts” to problem-solve while building their own universe, according to the company website.

By being incentivized to engage in certain activities, “there’s a level of executive functioning that’s being exercised and the idea is to do that repetitively,” said Dr. Sherer.

Users and their parents report improved ADHD symptoms after playing the game. One of the studies included in the review found 73% of children who played EndeavorRx reported improvement in their attention.

The company says there have been no serious adverse events seen in any clinical trial of EndeavorRx.

Dr. Sherer noted that many child psychiatrists play some sort of video game with their young patients who may be on the autism spectrum or have a learning disability.

“That may be one of the few ways to communicate with and effectively bond with the patient,” he said.

Despite their reputation of being violent and associated with “toxic subcultures,” video games can do a lot of good and be “restorative” for patients of all ages, Dr. Sherer added.

No relevant conflicts of interest were disclosed.

A version of this article appeared on Medscape.com.

FROM APA 2024

New mRNA Vaccines in Development for Cancer and Infections

BERLIN — To date, mRNA vaccines have had their largest global presence in combating the COVID-19 pandemic. Intensive research is underway on many other potential applications for this vaccine technology, which suggests a promising future. Martina Prelog, MD, a pediatric and adolescent medicine specialist at the University Hospital of Würzburg in Germany, reported on the principles, research status, and perspectives for these vaccines at the 25th Travel and Health Forum of the Center for Travel Medicine in Berlin.

To understand the future, the immunologist first examined the past. “The induction of cellular and humoral immune responses by externally injected mRNA was discovered in the 1990s,” she said.
 

Instability Challenge

Significant hurdles in mRNA vaccinations included the instability of mRNA and the immune system’s ability to identify foreign mRNA as a threat and destroy mRNA fragments. “The breakthrough toward vaccination came through Dr. Katalin Karikó, who, along with Dr. Drew Weissman, both of the University of Pennsylvania School of Medicine, discovered in 2005 that modifications of mRNA (replacing the nucleoside uridine with pseudouridine) enable better stability of mRNA, reduced immunogenicity, and higher translational capacity at the ribosomes,” said Dr. Prelog.

With this discovery, the two researchers paved the way for the development of mRNA vaccines against COVID-19 and other diseases. They were awarded the Nobel Prize in medicine for their discovery last year.
 

Improved Scalability

“Since 2009, mRNA vaccines have been studied as a treatment option for cancer,” said Dr. Prelog. “Since 2012, they have been studied for the influenza virus and respiratory syncytial virus [RSV].” Consequently, several mRNA vaccines are currently in development or in approval studies. “The mRNA technology offers the advantage of quickly and flexibly responding to new variants of pathogens and the ability to scale up production when there is high demand for a particular vaccine.”

Different forms and designations of mRNA vaccines are used, depending on the application and desired effect, said Dr. Prelog.

In nucleoside-modified mRNA vaccines, modifications in the mRNA sequence enable the mRNA to remain in the body longer and to induce protein synthesis more effectively.

Lipid nanoparticle (LNP)–encapsulated mRNA vaccines protect the coding mRNA sequences against degradation by the body’s enzymes and facilitate the uptake of mRNA into cells, where it then triggers the production of the desired protein. In addition, LNPs are involved in cell stimulation and support the self-adjuvant effect of mRNA vaccines, thus eliminating the need for adjuvants.

Self-amplifying mRNA vaccines include a special mRNA that replicates itself in the cell and contains a sequence for RNA replicase, in addition to the coding sequence for the protein. This composition enables increased production of the target protein without the need for a high amount of external mRNA administration. Such vaccines could trigger a longer and stronger immune response because the immune system has more time to interact with the protein.
 

Cancer Immunotherapy

Dr. Prelog also discussed personalized vaccines for cancer immunotherapy. Personalized mRNA vaccines are tailored to the patient’s genetic characteristics and antigens. They could be used in cancer immunotherapy to activate the immune system selectively against tumor cells.

Multivalent mRNA vaccines contain mRNA that codes for multiple antigens rather than just one protein to generate an immune response. These vaccines could be particularly useful in fighting pathogens with variable or changing surface structures or in eliciting protection against multiple pathogens simultaneously.

The technology of mRNA-encoded antibodies involves introducing mRNA into the cell, which creates light and heavy chains of antibodies. This step leads to the formation of antibodies targeted against toxins (eg, diphtheria and tetanus), animal venoms, infectious agents, or tumor cells.
 

Genetic Engineering

Dr. Prelog also reviewed genetic engineering techniques. In regenerative therapy or protein replacement therapy, skin fibroblasts or other cells are transfected with mRNA to enable conversion into induced pluripotent stem cells. This approach avoids the risk for DNA integration into the genome and associated mutation risks.

Another approach is making post-transcriptional modifications through RNA interference. For example, RNA structures can be used to inhibit the translation of disease-causing proteins. This technique is currently being tested against HIV and tumors such as melanoma.

In addition, mRNA technologies can be combined with CRISPR/Cas9 technology (“gene scissors”) to influence the creation of gene products even more precisely. The advantage of this technique is that mRNA is only transiently expressed, thus preventing unwanted side effects. Furthermore, mRNA is translated directly in the cytoplasm, leading to a faster initiation of gene editing.

Of the numerous ongoing clinical mRNA vaccine studies, around 70% focus on infections, about 12% on cancer, and the rest on autoimmune diseases and neurodegenerative disorders, said Dr. Prelog.
 

Research in Infections

Research in the fields of infectious diseases and oncology is the most advanced: mRNA vaccines against influenza and RSV are already in advanced clinical trials, Dr. Prelog told this news organization.

“Conventional influenza vaccines contain immunogenic surface molecules against hemagglutinin and neuraminidase in various combinations of influenza strains A and B and are produced in egg or cell cultures,” she said. “This is a time-consuming manufacturing process that takes months and, particularly with the egg-based process, bears the risk of changing the vaccine strain.”

“Additionally, influenza viruses undergo antigenic shift and drift through recombination, thus requiring annual adjustments to the vaccines. Thus, these influenza vaccines often lose accuracy in targeting circulating seasonal influenza strains.”

Several mRNA vaccines being tested contain not only coding sequences against hemagglutinin and neuraminidase but also for structural proteins of influenza viruses. “These are more conserved and mutate less easily, meaning they could serve as the basis for universal pandemic influenza vaccines,” said Dr. Prelog.

An advantage of mRNA vaccines, she added, is the strong cellular immune response that they elicit. This response is intended to provide additional protection alongside specific antibodies. An mRNA vaccine with coding sequences for the pre-fusion protein of RSV is in phase 3 trials for approval for vaccination in patients aged 60 years and older. It shows high effectiveness even in older patients and those with comorbidities.
 

Elaborate Purification Process

Plasmid DNA of bacterial origin is used to produce mRNA vaccines. The mRNA vaccines for COVID-19 raised concerns that production-related DNA residues could pose a safety risk and cause autoimmune diseases.

These vaccines “typically undergo a very elaborate purification process,” said Dr. Prelog. “This involves enzymatic digestion with DNase to fragment and deplete plasmid DNA, followed by purification using chromatography columns, so that no safety-relevant DNA fragments should remain afterward.”

Thus, the Paul-Ehrlich-Institut has also pointed out that the very small, fragmented plasmid DNA residues of bacterial origin in mRNA COVID-19 vaccines pose no risk, unlike the residual DNA from animal cell cultures that may be present in other vaccines.
 

Prevention and Therapy

In addition to the numerous advantages of mRNA vaccines (such as rapid adaptability to new or mutated pathogens, scalability, rapid production capability, self-adjuvant effect, strong induction of cellular immune responses, and safety), there are also challenges in RNA technology as a preventive and therapeutic measure, according to Dr. Prelog.

“Stability and storability, as well as the costs of new vaccine developments, play a role, as do the long-term effects regarding the persistence of antibody and cellular responses,” she said. The COVID-19 mRNA vaccines, for example, showed a well-maintained cellular immune response despite a tendency toward a rapid decline in humoral immune response.

“The experience with COVID-19 mRNA vaccines and the new vaccine developments based on mRNA technology give hope for an efficient and safe preventive and therapeutic use, particularly in the fields of infectious diseases and oncology,” Dr. Prelog concluded.

This story was translated from the Medscape German edition using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.


“The experience with COVID-19 mRNA vaccines and the new vaccine developments based on mRNA technology give hope for an efficient and safe preventive and therapeutic use, particularly in the fields of infectious diseases and oncology,” Dr. Prelog concluded.

This story was translated from the Medscape German edition using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.

BERLIN — To date, mRNA vaccines have had their largest global presence in combating the COVID-19 pandemic. Intensive research is underway on many other potential applications for this vaccine technology, which suggests a promising future. Martina Prelog, MD, a pediatric and adolescent medicine specialist at the University Hospital of Würzburg in Germany, reported on the principles, research status, and perspectives for these vaccines at the 25th Travel and Health Forum of the Center for Travel Medicine in Berlin.

To understand the future, the immunologist first examined the past. “The induction of cellular and humoral immune responses by externally injected mRNA was discovered in the 1990s,” she said.
 

Instability Challenge

Significant hurdles in mRNA vaccinations included the instability of mRNA and the immune system’s ability to identify foreign mRNA as a threat and destroy mRNA fragments. “The breakthrough toward vaccination came through Dr. Katalin Karikó, who, along with Dr. Drew Weissman, both of the University of Pennsylvania School of Medicine, discovered in 2005 that modifications of mRNA (replacing the nucleoside uridine with pseudouridine) enable better stability of mRNA, reduced immunogenicity, and higher translational capacity at the ribosomes,” said Dr. Prelog.

With this discovery, the two researchers paved the way for the development of mRNA vaccines against COVID-19 and other diseases. For it, they were awarded the 2023 Nobel Prize in Physiology or Medicine.
 

Improved Scalability

“Since 2009, mRNA vaccines have been studied as a treatment option for cancer,” said Dr. Prelog. “Since 2012, they have been studied for the influenza virus and respiratory syncytial virus [RSV].” Consequently, several mRNA vaccines are currently in development or in approval studies. “The mRNA technology offers the advantage of quickly and flexibly responding to new variants of pathogens and the ability to scale up production when there is high demand for a particular vaccine.”

Different forms and designations of mRNA vaccines are used, depending on the application and desired effect, said Dr. Prelog.

In nucleoside-modified mRNA vaccines, modifications in the mRNA sequence enable the mRNA to remain in the body longer and to induce protein synthesis more effectively.

Lipid nanoparticle (LNP)–encapsulated mRNA vaccines protect the coding mRNA sequences against degradation by the body’s enzymes and facilitate the uptake of mRNA into cells, where it then triggers the production of the desired protein. In addition, LNPs are involved in cell stimulation and support the self-adjuvant effect of mRNA vaccines, thus eliminating the need for adjuvants.

Self-amplifying mRNA vaccines include a special mRNA that replicates itself in the cell and contains a sequence for RNA replicase, in addition to the coding sequence for the protein. This composition enables increased production of the target protein without the need for a high amount of external mRNA administration. Such vaccines could trigger a longer and stronger immune response because the immune system has more time to interact with the protein.
 

Cancer Immunotherapy

Dr. Prelog also discussed personalized vaccines for cancer immunotherapy. Personalized mRNA vaccines are tailored to the patient’s genetic characteristics and antigens. They could be used in cancer immunotherapy to activate the immune system selectively against tumor cells.

Multivalent mRNA vaccines contain mRNA that codes for multiple antigens rather than just one protein to generate an immune response. These vaccines could be particularly useful in fighting pathogens with variable or changing surface structures or in eliciting protection against multiple pathogens simultaneously.

The technology of mRNA-encoded antibodies involves introducing mRNA into the cell, which then produces the light and heavy chains of an antibody. This leads to the formation of antibodies targeted against toxins (eg, diphtheria and tetanus), animal venoms, infectious agents, or tumor cells.
 

Genetic Engineering

Dr. Prelog also reviewed genetic engineering techniques. In regenerative therapy or protein replacement therapy, skin fibroblasts or other cells are transfected with mRNA to enable conversion into induced pluripotent stem cells. This approach avoids the risk for DNA integration into the genome and associated mutation risks.

Another approach is making post-transcriptional modifications through RNA interference. For example, RNA structures can be used to inhibit the translation of disease-causing proteins. This technique is currently being tested against HIV and tumors such as melanoma.

In addition, mRNA technologies can be combined with CRISPR/Cas9 technology (“gene scissors”) to influence the creation of gene products even more precisely. The advantage of this technique is that mRNA is only transiently expressed, thus preventing unwanted side effects. Furthermore, mRNA is translated directly in the cytoplasm, leading to a faster initiation of gene editing.

Of the numerous ongoing clinical mRNA vaccine studies, around 70% focus on infections, about 12% on cancer, and the rest on autoimmune diseases and neurodegenerative disorders, said Dr. Prelog.
 

Research in Infections

Research in the fields of infectious diseases and oncology is the most advanced: mRNA vaccines against influenza and RSV are already in advanced clinical trials, Dr. Prelog told this news organization.

“Conventional influenza vaccines contain immunogenic surface molecules against hemagglutinin and neuraminidase in various combinations of influenza strains A and B and are produced in egg or cell cultures,” she said. “This is a time-consuming manufacturing process that takes months and, particularly with the egg-based process, bears the risk of changing the vaccine strain.”

“Additionally, influenza viruses undergo antigenic shift and drift through recombination, thus requiring annual adjustments to the vaccines. Thus, these influenza vaccines often lose accuracy in targeting circulating seasonal influenza strains.”

Several mRNA vaccines being tested contain not only coding sequences against hemagglutinin and neuraminidase but also for structural proteins of influenza viruses. “These are more conserved and mutate less easily, meaning they could serve as the basis for universal pandemic influenza vaccines,” said Dr. Prelog.

An advantage of mRNA vaccines, she added, is the strong cellular immune response that they elicit. This response is intended to provide additional protection alongside specific antibodies. An mRNA vaccine with coding sequences for the pre-fusion protein of RSV is in phase 3 trials for approval for vaccination in patients aged 60 years and older. It shows high effectiveness even in older patients and those with comorbidities.
 

Elaborate Purification Process

Plasmid DNA of bacterial origin is used to produce mRNA vaccines. With the mRNA vaccines for COVID-19, this raised concerns that production-related DNA residues could pose a safety risk and cause autoimmune diseases.

These vaccines “typically undergo a very elaborate purification process,” said Dr. Prelog. “This involves enzymatic digestion with DNase to fragment and deplete plasmid DNA, followed by purification using chromatography columns, so that no safety-relevant DNA fragments should remain afterward.”

Thus, the Paul-Ehrlich-Institut has also pointed out that the very small, fragmented plasmid DNA residues of bacterial origin in mRNA COVID-19 vaccines pose no risk, unlike the risk that residual DNA from animal cell cultures might pose in other vaccines.
 

Prevention and Therapy

In addition to the numerous advantages of mRNA vaccines (such as rapid adaptability to new or mutated pathogens, scalability, rapid production capability, self-adjuvant effect, strong induction of cellular immune responses, and safety), there are also challenges in RNA technology as a preventive and therapeutic measure, according to Dr. Prelog.

“Stability and storability, as well as the costs of new vaccine developments, play a role, as do the long-term effects regarding the persistence of antibody and cellular responses,” she said. The COVID-19 mRNA vaccines, for example, showed a well-maintained cellular immune response despite a tendency toward a rapid decline in humoral immune response.

“The experience with COVID-19 mRNA vaccines and the new vaccine developments based on mRNA technology give hope for an efficient and safe preventive and therapeutic use, particularly in the fields of infectious diseases and oncology,” Dr. Prelog concluded.

This story was translated from the Medscape German edition using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.


Rural Health System ‘Teetering on Brink’ of Collapse, Says AMA


Physicians are leaving healthcare in droves, “not because they don’t want to practice ... but because the system is making it more and more difficult for them to care for their patients,” Bruce Scott, MD, president-elect of the American Medical Association (AMA), said at a press conference May 9 at the National Rural Health Association’s Annual Conference in New Orleans. 

He said that shrinking reimbursement rates and excessive administrative tasks are pushing doctors out of the workforce, exacerbating physician shortages in rural locations where 46 million Americans live. 

Rural areas have about one tenth of the specialists that urban areas do, and 65% of rural communities do not have enough primary care doctors, according to federal data. A recent Centers for Disease Control and Prevention report found that people living in rural areas are more likely to die early from preventable causes than their urban counterparts, said Dr. Scott. 

He said the AMA wants Congress to pass legislation to incentivize more physicians to work in rural areas and expand the number of rural and primary care residency spots. Historically, 80% of residents practice within 80 miles of where they complete residency, he said. 

Dr. Scott also hopes Congress will revise the J-1 visa rules to allow qualified international medical graduates to continue to practice in the United States. He’d like to see the pandemic telehealth flexibilities made permanent because these loosened guidelines greatly improved care access for rural areas in recent years. 

Lower Pay Affects Care in Rural, Urban Areas

Decreased reimbursements also have hit rural and urban doctors in independent practice particularly hard, Dr. Scott said. When adjusted for inflation, the current Medicare payment rate for physicians has dropped 29% since 2001, he said. Now that commercial payers tie their reimbursement models to the Medicare rate, physicians are experiencing “severe” financial stress amid rising practice costs and student loan debt. 

He shared anecdotes about how these issues have affected his private otolaryngology practice in Louisville, Kentucky, a state where more than 2 million people live in federally designated primary care professional shortage areas. 

“A major insurance company that controls over 60% of the private payer market in rural Kentucky [recently] offered us ... surgical rates less than they paid us 6 years ago,” he said. 

Dr. Scott said physicians must make difficult choices. “Do we not invest in the latest physical equipment? Do we reduce our number of employees? Do we perhaps stop accepting new Medicare patients?”

He noted that physicians now spend twice as much time on prior authorizations and other administrative tasks as they do on direct patient care. According to a 2022 AMA survey, 33% of physicians reported that the cumbersome prior authorization process led to a serious adverse event for a patient. Eighty percent reported it caused their patient to forgo treatment altogether.

Dr. Scott, who will be sworn in as AMA president in June, said he experiences the frustration daily. 

“I have to get on the phone and justify to an insurance person who rarely has gone to medical school, has never seen the patient, and heck, in my case, sometimes they can’t even say otolaryngology, much less tell me what the appropriate care is for my patient,” he said.

When asked about the impact of private equity in healthcare, Dr. Scott said there is room for all different modes of practice, but private equity could bring a unique benefit. 

“They have deeper pockets to potentially invest in telehealth technology, AI, and better computer systems,” he said. 

But, he said, some private equity-owned systems have abandoned rural areas, and in other regions they “push the physicians to move faster, see more patients, and do the things that are profit-driven.

“The key is to continue to provide ... quality medical care that is determined by an individual physician in consultation with the patient.”
 

A version of this article appeared on Medscape.com.



Can a Risk Score Predict Kidney Injury After Cisplatin?


Cisplatin is a preferred treatment for a wide range of cancers, including breast, head and neck, lung, ovary, and more. However, its side effects — particularly nephrotoxicity — can be severe. Kidney injury on cisplatin is associated with higher mortality and can jeopardize a patient’s eligibility for other therapies.

Now, in a large study using data from six US cancer centers, researchers have developed a risk algorithm to predict acute kidney injury (AKI) after cisplatin administration.

A risk prediction calculator based on the algorithm is available online for patients and providers to determine an individual patient's risk for kidney injury from cisplatin using readily available clinical data.

Other risk scores and risk prediction models have been developed to help clinicians assess in advance whether a patient might develop AKI after receiving cisplatin, so that more careful monitoring, dose adjustments, or an alternative treatment, if available, might be considered.

However, previous models were limited by factors such as small sample sizes, lack of external validation, older data, and liberal definitions of AKI, said Shruti Gupta, MD, MPH, director of onco-nephrology at Brigham and Women’s Hospital (BWH) and Dana-Farber Cancer Institute, and David E. Leaf, MD, MMSc, director of clinical and translational research in AKI, Division of Renal Medicine, BWH, Boston.

Dr. Gupta and Dr. Leaf believe their risk score for predicting severe AKI after intravenous (IV) cisplatin, published online in The BMJ, is “more accurate and generalizable than prior models for several reasons,” they told this news organization in a joint email.

“First, we externally validated our findings across cancer centers other than the one where it was developed,” they said. “Second, we focused on moderate to severe kidney injury, the most clinically relevant form of kidney damage, whereas prior models examined more mild forms of kidney injury. Third, we collected data on nearly 25,000 patients receiving their first dose of IV cisplatin, which is larger than all previous studies combined.”
 

‘Herculean Effort’

“We conceived of this study back in 2018, contacted collaborators at each participating cancer center, and had numerous meetings to try to gather granular data on patients treated with their first dose of intravenous (IV) cisplatin,” Dr. Gupta and Dr. Leaf explained. They also incorporated patient feedback from focus groups and surveys.

“This was truly a Herculean effort that involved physicians, programmers, research coordinators, and patients,” they said.

The multicenter study included 24,717 patients — 11,766 in the derivation cohort and 12,951 in the validation cohort. Overall, the median age was about 60 years, about 58% were men, and about 78% were White.

The primary outcome was cisplatin-induced AKI (CP-AKI), defined as a twofold or greater increase in serum creatinine or kidney replacement therapy within 14 days of a first dose of IV cisplatin.
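As a minimal sketch (a hypothetical helper, not code from the study), the paper's CP-AKI definition can be expressed directly:

```python
# Illustrative only: the study's CP-AKI criterion is a twofold or greater
# rise in serum creatinine, or initiation of kidney replacement therapy,
# within 14 days of the first IV cisplatin dose.

def meets_cp_aki_definition(baseline_cr, peak_cr_within_14d,
                            kidney_replacement_therapy=False):
    """Return True if the CP-AKI criteria are met.

    baseline_cr                -- serum creatinine before cisplatin (mg/dL)
    peak_cr_within_14d         -- highest creatinine in the 14 days after dosing
    kidney_replacement_therapy -- True if dialysis was started in that window
    """
    if kidney_replacement_therapy:
        return True
    return peak_cr_within_14d >= 2 * baseline_cr

# A rise from 1.0 to 2.1 mg/dL doubles creatinine and qualifies;
# a rise from 1.0 to 1.5 mg/dL does not.
print(meets_cp_aki_definition(1.0, 2.1))  # True
print(meets_cp_aki_definition(1.0, 1.5))  # False
```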

The researchers found that the incidence of CP-AKI was 5.2% in the derivation cohort and 3.3% in the validation cohort. Their simple risk score consisting of nine covariates — age, hypertension, type 2 diabetes, hemoglobin level, white blood cell count, platelet count, serum albumin level, serum magnesium level, and cisplatin dose — predicted a higher risk for CP-AKI in both cohorts.
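The nine covariates combine into a simple additive score. As a purely hypothetical sketch (the point values below are invented for illustration; the authors' actual coefficients are embodied in their published model and online calculator), such a points-style score works like this:

```python
# Purely illustrative: a points-style score over the study's nine covariates.
# The point values here are INVENTED for illustration -- consult the authors'
# published model and online calculator for the real weights.

ILLUSTRATIVE_POINTS = {
    "older_age": 2,               # eg, age above some threshold
    "hypertension": 1,
    "type_2_diabetes": 1,
    "low_hemoglobin": 1,
    "high_white_cell_count": 1,
    "low_platelet_count": 1,
    "low_serum_albumin": 2,
    "low_serum_magnesium": 1,
    "high_cisplatin_dose": 2,
}

def illustrative_risk_points(patient_flags):
    """Sum the points for each risk factor the patient has (True flags)."""
    return sum(points for name, points in ILLUSTRATIVE_POINTS.items()
               if patient_flags.get(name, False))

# A patient with hypertension, low albumin, and a high cisplatin dose:
flags = {"hypertension": True, "low_serum_albumin": True,
         "high_cisplatin_dose": True}
print(illustrative_risk_points(flags))  # 5
```

Patients are then binned by total points into the risk categories the study compares.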

Notably, adding serum creatinine to the model did not change the area under the curve, and therefore, serum creatinine, though also an independent risk factor for CP-AKI, was not included in the score.

Patients in the highest risk category had 24-fold higher odds of CP-AKI in the derivation cohort and close to 18-fold higher odds in the validation cohort than those in the lowest risk category.

The primary model had a C statistic of 0.75 (95% CI, 0.73-0.76) and showed better discrimination for CP-AKI than previously published models, for which the C statistics ranged from 0.60 to 0.68. The first author of a paper on an earlier model, Shveta Motwani, MD, MMSc, of BWH and Dana-Farber Cancer Institute in Boston, is also a coauthor of the new study.
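The C statistic (equivalently, concordance or area under the curve) is the probability that a randomly chosen patient who developed CP-AKI was assigned a higher predicted risk than a randomly chosen patient who did not. A self-contained sketch with made-up scores illustrates the calculation:

```python
# Generic concordance (C statistic) calculation -- not code from the study.
# Scores and outcomes below are made up to show how the metric behaves.

def c_statistic(scores, outcomes):
    """Fraction of (event, non-event) pairs where the event case scored
    higher; tied scores count as half a concordant pair."""
    pairs = concordant = 0.0
    for s_event, y_event in zip(scores, outcomes):
        if not y_event:
            continue
        for s_none, y_none in zip(scores, outcomes):
            if y_none:
                continue
            pairs += 1
            if s_event > s_none:
                concordant += 1
            elif s_event == s_none:
                concordant += 0.5
    return concordant / pairs

scores   = [0.9, 0.8, 0.3, 0.2, 0.6]   # predicted risks
outcomes = [1,   0,   0,   0,   1]     # 1 = developed CP-AKI
print(round(c_statistic(scores, outcomes), 2))  # 0.83
```

A value of 0.5 would mean the score ranks patients no better than chance; the reported 0.75 indicates the model correctly ranks an AKI/no-AKI pair three times out of four.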

Greater severity of CP-AKI was associated with shorter 90-day survival (adjusted hazard ratio, 4.63; 95% CI, 3.56-6.02) for stage III CP-AKI vs no CP-AKI.
 

‘Definitive Work’

Joel M. Topf, MD, a nephrologist with expertise in chronic kidney disease in Detroit, who wasn’t involved in the development of the risk score, called the study “a definitive work on an important concept in oncology and nephrology.”

“While this is not the first attempt to devise a risk score, it is by far the biggest,” he told this news organization. Furthermore, the authors “used a diverse population, recruiting patients with a variety of cancers (previous attempts had often used a homogenous diagnosis, putting into question how generalizable the results were) from six different cancer centers.”

In addition, he said, “The authors did not restrict patients with chronic kidney disease or other significant comorbidities and used the geographic diversity to produce a cohort that has an age, gender, racial, and ethnic distribution, which is more representative of the US than previous, single-center attempts to risk score patients.”

An earlier model used the Kidney Disease: Improving Global Outcomes (KDIGO) consensus definition of AKI of an increase in serum creatinine of 0.3 mg/dL, he noted. “While a sensitive definition of AKI, it captures mild, hemodynamic increases in creatinine of questionable significance,” he said.

By contrast, the new score uses KDIGO stage II and above to define AKI. “This is a better choice, as we do not want to dissuade patients and doctors from choosing chemotherapy due to a fear of insignificant kidney damage,” he said.

All that said, Dr. Topf noted that neither the current score nor the earlier model included serum creatinine. “This is curious to me and may represent the small number of patients with representative elevated creatinine in the derivation cohort (only 1.3% with an estimated glomerular filtration rate [eGFR] < 45).”

“Since the cohort is made up of people who received cis-platinum, the low prevalence of eGFRs < 45 may be due to physicians steering away from cis-platinum in this group,” he suggested. “It would be unfortunate if this risk score gave an unintentional ‘green light’ to these patients, exposing them to predictable harm.”
 

‘Certainly Useful’

Anushree Shirali, MD, an associate professor in the Section of Nephrology and consulting physician, Yale Onco-Nephrology, Yale School of Medicine, in New Haven, Connecticut, said that having a prediction score for which patients are more likely to develop AKI after a single dose of cisplatin would be helpful for oncologists, as well as nephrologists.

As a nephrologist, Dr. Shirali mostly sees patients who already have AKI, she told this news organization. But there are circumstances in which the tool could still be helpful.

“Let’s say someone has abnormal kidney function at baseline — ie, creatinine is higher than the normal range — and they were on dialysis 5 years ago for something else, and now, they have cancer and may be given cisplatin. They worry about their chances of getting AKI and needing dialysis again,” she said. “That’s just one scenario in which I might be asked to answer that question and the tool would certainly be useful.”

Other scenarios could include a patient who has only one kidney because they donated the other for transplant years ago, and who now has a malignancy and wonders about their actual risk for kidney issues on cisplatin.

Oncologists could use the tool to determine whether a patient should be treated with cisplatin, or if they’re at high risk, whether an alternative that’s not nephrotoxic might be used. By contrast, “if somebody’s low risk and an oncologist thinks cisplatin is the best agent they have, then they might want to go ahead and use it,” Dr. Shirali said.

Future research could take into consideration that CP-AKI is dose dependent, she suggested, because a prediction score that included the number of cisplatin doses could be even more helpful to determine risk. And, even though the derivation and validation cohorts for the new tool are representative of the US population, additional research should also include more racial/ethnic diversity, she said.

Dr. Gupta and Dr. Leaf hope their tool “will be utilized immediately by patients and providers to help predict an individual’s risk of cisplatin-associated kidney damage. It is easy to use, available for free online, and incorporates readily available clinical variables.”

If a patient is at high risk, the clinical team can consider preventive measures such as administering more IV fluids before receiving cisplatin or monitoring kidney function more closely afterward, they suggested.

Dr. Gupta reported research support from the National Institutes of Health (NIH) and the National Institute of Diabetes and Digestive and Kidney Diseases. She also reported research funding from BTG International, GE HealthCare, and AstraZeneca outside the submitted work. She is a member of GlaxoSmithKline’s Global Anemia Council, a consultant for Secretome and Proletariat Therapeutics, and founder and president emeritus of the American Society of Onconephrology (unpaid). Dr. Leaf is supported by NIH grants, reported research support from BioPorto, BTG International, and Metro International Biotech, and has served as a consultant. Dr. Topf reported an ownership stake in a few DaVita-run dialysis clinics. He also runs a vascular access center and has participated in advisory boards with Cara Therapeutics, Vifor, Astra Zeneca, Bayer, Renibus Therapeutics, Travere Therapeutics, and GlaxoSmithKline. He is president of NephJC, a nonprofit educational organization with no industry support. Dr. Shirali declared no competing interests.

A version of this article appeared on Medscape.com.




FROM THE BMJ


Top Predictors of Substance Initiation in Youth Flagged


By age 12 years, more than 14% of children have tried alcohol or tobacco, and religion, race, and income are the top predictors of beginning to use these and other substances, new research suggests.

Aside from sociodemographic parameters, risk factors for substance use initiation include prenatal exposure to substances, peer use of alcohol and nicotine, and problematic school behavior, among other things, the study showed.

The results show certain modifiable risk factors may play a role in preventing youth from starting to use substances, said study author ReJoyce Green, PhD, research assistant professor, Department of Psychiatry and Behavioral Sciences, Medical University of South Carolina, Charleston.

“If we’re designing, say, a prevention program or an early intervention program, these are things that could make a difference, so let’s make sure we’re bringing them into the conversation.”

The findings were presented at the American Psychiatric Association (APA) 2024 Annual Meeting and published online in The American Journal of Psychiatry.
 

Critical Risk Factors

Use of alcohol, tobacco, and cannabis often begins during adolescence. One recent survey showed that 23% of 13-year-olds reported using alcohol, 17% reported vaping nicotine, and 8% reported vaping cannabis. Other research links younger age at substance use initiation to a more rapid transition to substance use disorders and higher rates of psychiatric disorders.

Previous studies examining predictors of substance use initiation in the Adolescent Brain Cognitive Development (ABCD) Study dataset focused primarily on self-reported measures, but the current study also looked at models that include hormones and neurocognitive factors as well as neuroimaging.

This study included 6829 children aged 9 and 10 years from the ABCD Study who had never tried substances and who were followed for 3 years.

A sophisticated statistical approach was used to examine 420 variables as predictors of substance use initiation. Initiation was defined as trying any nonprescribed substance by age 12 years. “That’s including a single sip of alcohol or puff of a cigarette,” said Dr. Green.

In addition to alcohol, nicotine, and cannabis, researchers looked at initiation of synthetic cannabinoids, cocaine, methamphetamine, and ketamine, among other substances.

Self-reported measures included demographic characteristics, self and peer involvement with substance use, parenting behaviors, mental and physical health, and culture and environmental factors.

The analytical approach used machine-learning algorithms to compare the ability of domains to identify the most critical risk factors. Magnitudes of coefficients were used to assess variable importance, with positive coefficients indicating greater likelihood of substance initiation and negative coefficients indicating lower likelihood of initiation.
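As a rough illustration of this kind of coefficient-based importance ranking, the sketch below fits a logistic regression by gradient descent on synthetic data and ranks predictors by coefficient magnitude, with the sign giving the direction of association. The variable names and data are entirely hypothetical; this is a generic sketch of the idea, not the study's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
names = ["peer_use", "school_problems", "parental_monitoring"]  # hypothetical

# Synthetic standardized predictors and a known data-generating model.
X = rng.normal(size=(n, 3))
true_w = np.array([1.2, 0.6, -0.8])
p = 1 / (1 + np.exp(-(X @ true_w - 1.0)))
y = (rng.random(n) < p).astype(float)

# Fit logistic regression by full-batch gradient descent.
w, b = np.zeros(3), 0.0
for _ in range(2000):
    pred = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (pred - y) / n)
    b -= 0.5 * (pred - y).mean()

# Rank variables by |coefficient|; positive sign -> greater likelihood
# of initiation, negative sign -> lower likelihood.
ranking = sorted(zip(names, w), key=lambda t: -abs(t[1]))
for name, coef in ranking:
    print(f"{name}: {coef:+.2f}")
```

In this toy example, the recovered coefficients mirror the generating model, so the strongest simulated risk factor ranks first and the protective factor carries a negative sign, mirroring how the study reads its coefficients.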

By age 12 years, 14.4% of the children studied reported substance initiation. Alcohol was the substance most commonly initiated (365 individuals), followed by nicotine (94 individuals) and cannabis (40 individuals), with few or no children initiating other substances.

The groups that did and did not initiate substances were similar in age, and most participants identified as White and non-Hispanic. However, the substance-use group had a lower percentage of girls and a higher percentage of White participants than the no-substance-use group.

The model with only self-reported data had similar accuracy in predicting substance use initiation (area under the curve [AUC], 0.67) as models that added resource-intensive measures such as neurocognitive tests and hormones (AUC, 0.67) and neuroimaging (AUC, 0.66).
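The AUC used to compare these models equals the probability that a randomly chosen initiator receives a higher predicted risk than a randomly chosen non-initiator. A minimal rank-based computation of that quantity (a generic sketch, not the study's code):

```python
import numpy as np

def auc(scores, labels):
    """ROC AUC via the rank-sum (Mann-Whitney U) identity.

    Illustrative sketch: assumes no tied scores (ties would need average ranks).
    """
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=int)
    ranks = np.empty(len(scores))
    ranks[scores.argsort()] = np.arange(1, len(scores) + 1)
    n_pos = labels.sum()
    n_neg = len(labels) - n_pos
    # Sum of positive-class ranks, minus its minimum possible value,
    # normalized by the number of positive/negative pairs.
    return (ranks[labels == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
```

On this scale, 0.5 is chance and 1.0 is perfect discrimination, so the reported AUCs of about 0.67 mean the models rank a true initiator above a non-initiator roughly two thirds of the time.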
 

Religious Predictors

The strongest predictors of substance use initiation were related to religion: Youths whose parents reported a religious preference for Mormonism were less likely to initiate substance use (coefficient, -0.87), whereas youths whose parents reported a religious preference for Judaism were more likely to initiate substance use (coefficient, 0.32).

The third top predictor was race: Black youths were less likely to initiate substance use (coefficient, -0.32). This was followed by youths whose parents reported a religious preference for Islam who were also less likely to initiate substance use (coefficient, -0.25).

The research examined over 15 different religious categories, “so we really tried to be expansive,” noted Dr. Green.

It’s unclear why some religions appeared to have a protective impact when it comes to substance use initiation whereas others have the opposite effect. Future research could perhaps identify which components of religiosity affect substance use initiation. If so, these aspects could be developed and incorporated into prevention and intervention programs, said Dr. Green.

Next on the list of most important predictors was living in a household with an income of $12,000-$15,999; these youths were less likely to initiate substance use (coefficient, -0.22).

Within the culture and environment domain, a history of detention or suspension was a top predictor of substance use initiation (coefficient, 0.20). Prenatal exposure to substance use was also a robust predictor in the physical health category (coefficient, 0.15).

Other predictors included: parents with less than a high school degree or GED (coefficient, -0.14), substance use availability (coefficient, 0.12), and age at baseline (coefficient, 0.12).

The study also showed that better cognitive functioning in selected domains (eg, cognitive control, attention, and language ability) is associated with a greater likelihood of substance use initiation.
 

Shaping Future Prevention

Applying these findings in clinical settings could help tailor prevention and early intervention efforts, said the authors. It might be prudent to allocate resources to collecting data related to self-, peer-, and familial-related factors, “which were more informative in predicting substance use initiation during late childhood and early adolescence in the present study,” they wrote.

Researchers will continue to track these children through to a 10-year follow-up, said Dr. Green. “I’m really curious to see if the factors we found when they were 12 and 13, such as those related to peers and family, still hold when they’re ages 17 and 18, because there’s going to be a huge amount of brain development that’s happening throughout this phase.”

The group that initiated substance use and the group that didn’t initiate substance use were not totally balanced, and sample sizes for some religious categories were small. Another study limitation was that the analytic approach didn’t account for multilevel data within the context of site and families.

Commenting on the findings, Kathleen Brady, MD, PhD, distinguished university professor and director, South Carolina Clinical and Translational Research Institute, Medical University of South Carolina, said that the study is “critical and complex.” This, she said, is especially true as cannabis has become more accessible and potent, and as the federal government reportedly considers reclassifying it from a Schedule I drug (which includes highly dangerous, addictive substances with no medical use) to a Schedule III drug (which can be prescribed as a medication).

“The part that is the most frightening to me is the long-lasting effects that can happen when young people start using high-potency marijuana at an early age,” said Dr. Brady. “So, any information that we can give to parents, to teachers, to the public, and to doctors is important.”

She’s looking forward to getting more “incredibly important” information on substance use initiation as the study progresses and the teens get older. 

The study received support from the National Institute on Alcohol Abuse and Alcoholism and the National Institute on Drug Abuse.

A version of this article appeared on Medscape.com.

By age 12 years, more than 14% of children have tried alcohol or tobacco, and religion, race, and income are the top predictors beginning to use these and other substances, new research suggests.

Aside from sociodemographic parameters, risk factors for substance use initiation include prenatal exposure to substances, peer use of alcohol and nicotine, and problematic school behavior, among other things, the study showed.

The results show certain modifiable risk factors may play a role in preventing youth from starting to use substances, said study author ReJoyce Green, PhD, research assistant professor, Department of Psychiatry and Behavioral Sciences, Medical University of South Carolina, Charleston.

“If we’re designing, say, a prevention program or an early intervention program, these are things that could make a difference, so let’s make sure we’re bringing them into the conversation.”

The findings were presented at the annual meeting of the American Psychiatric Association (APA) and published online in The American Journal of Psychiatry.
 

Critical Risk Factors

Use of alcohol, tobacco, and cannabis often begins during adolescence. One recent survey showed that 23% of 13-year-olds reported using alcohol, 17% reported vaping nicotine, and 8% reported vaping cannabis. Other research links younger age at substance use initiation to a more rapid transition to substance use disorders and higher rates of psychiatric disorders.

Previous studies examining predictors of substance use initiation in the Adolescent Brain Cognitive Development (ABCD) Study dataset focused primarily on self-reported measures, but the current study also looked at models that include hormones and neurocognitive factors as well as neuroimaging.

This study included 6829 children aged 9-10 years from the ABCD Study who had never tried substances and were followed for 3 years.

A sophisticated statistical approach was used to examine 420 variables as predictors of substance use initiation. Initiation was defined as trying any nonprescribed substance by age 12 years. “That’s including a single sip of alcohol or puff of a cigarette,” said Dr. Green.

In addition to alcohol, nicotine, and cannabis, researchers looked at initiation of synthetic cannabinoids, cocaine, methamphetamine, and ketamine, among other substances.

Self-reported measures included demographic characteristics, self and peer involvement with substance use, parenting behaviors, mental and physical health, and culture and environmental factors.

The analytical approach used machine-learning algorithms to compare the ability of domains to identify the most critical risk factors. Magnitudes of coefficients were used to assess variable importance, with positive coefficients indicating greater likelihood of substance initiation and negative coefficients indicating lower likelihood of initiation.
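The sign-and-magnitude logic described above can be sketched in a few lines of Python. The coefficient values below are illustrative placeholders echoing figures reported later in this article, not the study's actual model output:

```python
# Hypothetical coefficients from a penalized regression model (illustrative
# values only). Sign gives direction of association; absolute magnitude
# is used to rank variable importance, as in the study's approach.
coefficients = {
    "parental religion: Mormonism": -0.87,
    "race: Black": -0.32,
    "parental religion: Judaism": 0.32,
    "parental religion: Islam": -0.25,
    "detention/suspension history": 0.20,
    "prenatal substance exposure": 0.15,
}

# Rank predictors by absolute coefficient size.
ranked = sorted(coefficients.items(), key=lambda kv: abs(kv[1]), reverse=True)
for name, coef in ranked:
    direction = "greater" if coef > 0 else "lower"
    print(f"{name}: {coef:+.2f} ({direction} likelihood of initiation)")
```

Running this lists the parental-religion variables at the top of the ranking, mirroring the ordering the investigators reported.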

By age 12 years, 14.4% of the children studied reported substance initiation. Alcohol was the substance most commonly initiated (365 individuals), followed by nicotine (94 individuals) and cannabis (40 individuals), with few or no children initiating other substances.

Both those who did and did not initiate substances were similarly aged, and most participants identified as White and non-Hispanic. But the substance-use group had a lower percentage of girls and higher percentage of White participants compared with the no-substance-use group.

The model with only self-reported data had similar accuracy in predicting substance use initiation (area under the curve [AUC], 0.67) as models that added resource-intensive measures such as neurocognitive tests and hormones (AUC, 0.67) and neuroimaging (AUC, 0.66).
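For readers unfamiliar with the metric, the AUC is the probability that a model assigns a higher risk score to a randomly chosen initiator than to a randomly chosen non-initiator (0.5 is chance, 1.0 is perfect discrimination). A minimal pure-Python sketch on toy data, not the study's data:

```python
def auc(labels, scores):
    """Rank-based AUC: probability a randomly chosen positive case is
    scored above a randomly chosen negative case (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: 1 = initiated substance use, 0 = did not.
labels = [1, 1, 0, 0, 0]
scores = [0.9, 0.4, 0.6, 0.3, 0.2]  # model-predicted risk
print(round(auc(labels, scores), 2))  # prints 0.83
```

An AUC around 0.67, as reported here, indicates modest discrimination; the practical point is that the cheap self-report model matched the expensive imaging and hormone models.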
 

 

 

Religious Predictors

The strongest predictors of substance use initiation were related to religion: Youths whose parents reported a religious preference for Mormonism were less likely to initiate substance use (coefficient, -0.87), whereas youths whose parents reported a religious preference for Judaism were more likely to initiate substance use (coefficient, 0.32).

The third top predictor was race: Black youths were less likely to initiate substance use (coefficient, -0.32). This was followed by youths whose parents reported a religious preference for Islam who were also less likely to initiate substance use (coefficient, -0.25).

The research examined over 15 different religious categories, “so we really tried to be expansive,” noted Dr. Green.

It’s unclear why some religions appeared to have a protective impact when it comes to substance use initiation whereas others have the opposite effect. Future research could perhaps identify which components of religiosity affect substance use initiation. If so, these aspects could be developed and incorporated into prevention and intervention programs, said Dr. Green.

Next on the list of most important predictors was being part of a household with an income of $12,000-$15,999; these youths were more likely to initiate substance use (coefficient, 0.22).

Within the culture and environment domain, a history of detention or suspension was a top predictor of substance use initiation (coefficient, 0.20). Prenatal exposure to substance use was also a robust predictor in the physical health category (coefficient, 0.15).

Other predictors included parents with less than a high school degree or GED (coefficient, -0.14), substance use availability (coefficient, 0.12), and age at baseline (coefficient, 0.12).

The study also showed that better cognitive functioning in selected domains (eg, cognitive control, attention, and language ability) is associated with a greater likelihood of substance use initiation.
 

Shaping Future Prevention

Applying these findings in clinical settings could help tailor prevention and early intervention efforts, said the authors. It might be prudent to allocate resources to collecting data related to self-, peer-, and familial-related factors, “which were more informative in predicting substance use initiation during late childhood and early adolescence in the present study,” they wrote.

Researchers will continue to track these children through to a 10-year follow-up, said Dr. Green. “I’m really curious to see if the factors we found when they were 12 and 13, such as those related to peers and family, still hold when they’re ages 17 and 18, because there’s going to be a huge amount of brain development that’s happening throughout this phase.”

The group that initiated substance use and the group that didn’t initiate substance use were not totally balanced, and sample sizes for some religious categories were small. Another study limitation was that the analytic approach didn’t account for multilevel data within the context of site and families.

Commenting on the findings, Kathleen Brady, MD, PhD, distinguished university professor and director, South Carolina Clinical and Translational Research Institute, Medical University of South Carolina, said that the study is “critical and complex.” This, she said, is especially true as cannabis has become more accessible and potent, and as the federal government reportedly considers reclassifying it from a Schedule I drug (which includes highly dangerous, addictive substances with no medical use) to a Schedule III drug (which can be prescribed as a medication).

“The part that is the most frightening to me is the long-lasting effects that can happen when young people start using high-potency marijuana at an early age,” said Dr. Brady. “So, any information that we can give to parents, to teachers, to the public, and to doctors is important.”

She’s looking forward to getting more “incredibly important” information on substance use initiation as the study progresses and the teens get older. 

The study received support from the National Institute on Alcohol Abuse and Alcoholism and the National Institute on Drug Abuse.

A version of this article appeared on Medscape.com.

Article Source

FROM APA 2024


Can Nectin-4 Expression Predict Response to Bladder Cancer Treatment?


The expression of the protein Nectin-4 can predict how patients with metastatic urothelial cancer (m)UC will respond to the anti-Nectin-4 antibody-drug conjugate (ADC) enfortumab vedotin (EV), a new study finds.

Identifying biomarkers to predict how patients will respond to targeted therapies is crucial to improve treatments for patients with cancer, authors Niklas Klümper, MD, with the Department of Urology and Pediatric Urology at University Hospital Bonn, in Germany, and colleagues, wrote in the Journal of Clinical Oncology (doi: 10.1200/JCO.23.01983).

The researchers used a Nectin-4-specific fluorescence in situ hybridization (FISH) assay in an (m)UC cohort of 108 patients to test Nectin-4’s ability to predict responses, analyzing slides with a fluorescence microscope. The copy number variations (CNVs) were correlated with membranous Nectin-4 protein expression, responses to EV treatment, and outcomes.

They also evaluated the prognostic value of Nectin-4 CNVs with biopsies of 103 (m)UC patients not treated with EV. Additionally, they searched The Cancer Genome Atlas (TCGA) data sets (10,712 patients across 32 cancer types) for Nectin-4 CNVs.
 

Why Was This Study Done?

Urothelial carcinoma accounts for 90% of bladder cancer cases globally. Though EV was approved to treat (m)UC in 2019, lasting benefit has been achieved only in a small subset of patients.

EV is given to all without selecting patients based on biomarkers that may predict how well they will respond to EV. In this study, researchers investigated whether response to EV was better when people had amplification — defined as increased numbers of copies — of Nectin-4.
 

How Common Is It for Patients With (m)UC to Have Nectin-4 Amplifications?

Nectin-4 amplifications happen frequently in (m)UC; they occurred in about 26% of the (m)UC patients the researchers studied, according to the new paper.

The amplifications are frequent in other cancer types as well, and this study suggests that this biomarker is a promising candidate for developing Nectin-4–targeted antibody-drug conjugates for other cancers.

“Nectin-4 amplifications can be found in 5%-10% of breast cancer and non–small cell lung cancer, both tumor types with a high impact on all-cancer mortality, which are currently being evaluated for EV response,” the authors wrote.

Currently, (m)UC is the only cancer for which EV is approved as standard of care, the researchers explain in their paper.
 

What Were the Differences Between the EV and Non-EV Groups?

Almost all patients in the cohort with Nectin-4 amplifications (27 of 28; 96%) had objective responses to EV, compared with 24 of 74 (32%) in the group without amplifications (P less than .001). Among the 96% with a response, 82% had a partial response and 14% had a complete response.
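The reported percentages follow directly from the raw counts; a quick arithmetic check in Python:

```python
# Objective response rates reported in the study, computed from raw counts.
responders_amp, total_amp = 27, 28   # Nectin-4-amplified, EV-treated
responders_non, total_non = 24, 74   # nonamplified, EV-treated

orr_amp = responders_amp / total_amp
orr_non = responders_non / total_non
print(f"amplified: {orr_amp:.0%}, nonamplified: {orr_non:.0%}")
# prints: amplified: 96%, nonamplified: 32%
```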

The amplifications for those treated with EV were linked with longer progression-free survival (90% 12-month PFS vs 41% for those with nonamplified tumors) and longer overall survival (OS).

For those patients treated with EV who had the amplifications, median OS was not reached, because more than half of these patients were still alive at 12 months. That finding contrasts with a median OS of 8.8 months in EV-treated patients without the amplifications.

EV-treated patients who had Nectin-4 amplifications had a 92% lower risk of death compared with EV-treated patients without the amplifications, according to an analysis that adjusted for factors including age and sex.

“Importantly, in the non–EV-treated patients with (m)UC, Nectin-4 amplifications have no impact on OS [overall survival], suggesting that Nectin-4 amplifications are neither indicating aggressive nor favorable tumor biology, strengthening its potential value as a pure predictive biomarker,” the researchers wrote.
 

 

 

What Are the Implications of These Findings?

“[O]ur study suggests that Nectin-4 amplification is a simple, valuable, and easy-to-implement predictive biomarker for EV in patients with (m)UC. The frequent occurrence of Nectin-4 amplifications in other cancer types suggests that this biomarker is a promising candidate with broader applicability for clinical development of Nectin-4-targeted ADCs in a tumor-agnostic context.”

Choosing the best therapy sequence for (m)UC is crucial, the authors write. Considering Nectin-4 amplifications could inform EV drug development — even at earlier stages of the disease — by defining which patient subgroup has the highest chance for long-term benefit.

The authors acknowledge that the primary limitation of the study is that it is retrospective, using archived primary and metastatic tumor specimens with varying ranges between the time of tumor sampling and start of EV treatment.

“Therefore, our data are hypothesis-generating and prospective confirmation in larger, biomarker-driven trials is mandatory,” the authors wrote.

They note that EV plus pembrolizumab [Keytruda] (EV/P) was recently approved as the new standard of care in first-line treatment for (m)UC, so the predictive value of Nectin-4 amplification in this new treatment setting warrants further research.

Dr. Klümper reports stock and other ownership interests in Bicycle Therapeutics, Pfizer, Daiichi Sankyo/UCB Japan, and Immatics; and honoraria for Astellas Pharma and MSD Oncology; and consulting or advisory roles with Astellas Pharma, MSD Oncology, and Eisai. He reports travel reimbursements from Ipsen, Photocure, and MSD Oncology. Other author disclosures are available with the full text of the paper.


The expression of the protein Nectin-4 can predict how patients with metastatic urothelial cancer (m)UC will respond to the anti-Nectin-4 antibody-drug conjugate (ADC) enfortumab vedotin (EV), a new study finds.

Identifying biomarkers to predict how patients will respond to targeted therapies is crucial to improve treatments for patients with cancer, authors Niklas Klümper, MD, with the Department of Urology and Pediatric Urology at University Hospital Bonn, in Germany, and colleagues, wrote in the Journal of Clinical Oncology (doi: 10.1200/JCO.23.01983).

The researchers used a Nectin-4-specific fluorescence in situ hybridization (FISH) assay in an (m)UC cohort of 108 patients to test Nectin-4’s ability to predict responses, analyzing slides with a fluorescence microscope. The copy number variations (CNVs) were correlated with membranous Nectin-4 protein expression, responses to EV treatment, and outcomes.

They also evaluated the prognostic value of Nectin-4 CNVs with biopsies of 103 (m)UC patients not treated with EV. Additionally, they searched The Cancer Genome Atlas (TCGA) data sets (10,712 patients across 32 cancer types) for Nectin-4 CNVs.
 

Why Was This Study Done?

Urothelial carcinoma accounts for 90% of bladder cancer cases globally. Though EV was approved to treat (m)UC in 2019, lasting benefit has been achieved only in a small subset of patients.

EV is given to all without selecting patients based on biomarkers that may predict how well they will respond to EV. In this study, researchers investigated whether response to EV was better when people had amplification — defined as increased numbers of copies — of Nectin-4.
 

How Common Is It for Patients With (m)UC to Have Nectin-4 Amplifications?

Nectin-4 amplifications happen frequently in (m)UC; they occurred in about 26% of the (m)UC patients the researchers studied, according to the new paper.

The amplifications are frequent in other cancer types as well, and this study suggests that this biomarker is a promising candidate for developing Nectin-4–targeted antibody-drug conjugates for other cancers.

“Nectin-4 amplifications can be found in 5%-10% of breast cancer and non–small cell lung cancer, both tumor types with a high impact on all-cancer mortality, which are currently being evaluated for EV response,” the authors wrote.

Currently, (m)UC is the only cancer for which EV is approved as standard-of-care, the researchers explain, in their paper.
 

What Were the Differences Between the EV and Non-EV Groups?

Almost all (27 of the 28) patients in the cohort (96%) who had Nectin-4 amplifications had objective responses to EV compared with 24 of 74 (32%) in the group without amplifications (P less than .001). Among the 96% with a response, 82% had partial response and 14% had a complete response.

The amplifications for those treated with EV were linked with longer progression-free survival (90% 12-month PFS vs 41% for those with nonamplified tumors) and longer overall survival (OS).

For those patients treated with EV who had the amplifications, OS was not reached. This was because the researchers could not calculate the OS at 12 months for this group due to more than half of the patients still being alive at that time. That finding contrasts with a median OS of 8.8 months in those patients treated with EV who did not have the amplifications.

EV-treated patients who had Nectin-4 amplifications had a 92% lower risk of death compared with EV-treated patients without the amplifications, according to an analysis that adjusted for factors including age and sex.

“Importantly, in the non–EV-treated patients with (m)UC, Nectin-4 amplifications have no impact on OS [overall survival], suggesting that Nectin-4 amplifications are neither indicating aggressive nor favorable tumor biology, strengthening its potential value as a pure predictive biomarker,” the researchers wrote.
 

 

 

What Are the Implications of These Findings?

“[O]ur study suggests that Nectin-4 amplification is a simple, valuable, and easy-to-implement predictive biomarker for EV in patients with (m)UC. The frequent occurrence of Nectin-4 amplifications in other cancer types suggests that this biomarker is a promising candidate with broader applicability for clinical development of Nectin-4-targeted ADCs in a tumor-agnostic context.”

Choosing the best therapy sequence for (m)UC is crucial, the authors write. Considering Nectin-4 amplifications could inform EV drug development — even at earlier stages of the disease — by defining which patient subgroup has the highest chance for long-term benefit.

The authors acknowledge that the primary limitation of the study is that it is retrospective, using archived primary and metastatic tumor specimens with varying ranges between the time of tumor sampling and start of EV treatment.

“Therefore, our data are hypothesis-generating and prospective confirmation in larger, biomarker-driven trials is mandatory,” the authors wrote.

They note that EV plus pembrolizumab [Keytruda] (EV/P) was recently approved as the new standard of care in first-line treatment for (m)UC, so the predictive value of Nectin-4 amplification in this new treatment setting warrants further research.

Dr. Klümper reports stock and other ownership interests in Bicycle Therapeutics, Pfizer, Daiichi Sankyo/UCB Japan, and Immatics; and honoraria for Astellas Pharma and MSD Oncology; and consulting or advisory roles with Astellas Pharma, MSD Oncology, and Eisai. He reports travel reimbursements from Ipsen, Photocure, and MSD Oncology. Other author disclosures are available with the full text of the paper.

 

The expression of the protein Nectin-4 can predict how patients with metastatic urothelial cancer (m)UC will respond to the anti-Nectin-4 antibody-drug conjugate (ADC) enfortumab vedotin (EV), a new study finds.

Identifying biomarkers to predict how patients will respond to targeted therapies is crucial to improve treatments for patients with cancer, authors Niklas Klümper, MD, with the Department of Urology and Pediatric Urology at University Hospital Bonn, in Germany, and colleagues, wrote in the Journal of Clinical Oncology (doi: 10.1200/JCO.23.01983).

The researchers used a Nectin-4-specific fluorescence in situ hybridization (FISH) assay in an (m)UC cohort of 108 patients to test Nectin-4’s ability to predict responses, analyzing slides with a fluorescence microscope. The copy number variations (CNVs) were correlated with membranous Nectin-4 protein expression, responses to EV treatment, and outcomes.

They also evaluated the prognostic value of Nectin-4 CNVs with biopsies of 103 (m)UC patients not treated with EV. Additionally, they searched The Cancer Genome Atlas (TCGA) data sets (10,712 patients across 32 cancer types) for Nectin-4 CNVs.
 

Why Was This Study Done?

Urothelial carcinoma accounts for 90% of bladder cancer cases globally. Though EV was approved to treat (m)UC in 2019, lasting benefit has been achieved only in a small subset of patients.

EV is given to all without selecting patients based on biomarkers that may predict how well they will respond to EV. In this study, researchers investigated whether response to EV was better when people had amplification — defined as increased numbers of copies — of Nectin-4.
 

How Common Is It for Patients With (m)UC to Have Nectin-4 Amplifications?

Nectin-4 amplifications happen frequently in (m)UC; they occurred in about 26% of the (m)UC patients the researchers studied, according to the new paper.

The amplifications are frequent in other cancer types as well, and this study suggests that this biomarker is a promising candidate for developing Nectin-4–targeted antibody-drug conjugates for other cancers.

“Nectin-4 amplifications can be found in 5%-10% of breast cancer and non–small cell lung cancer, both tumor types with a high impact on all-cancer mortality, which are currently being evaluated for EV response,” the authors wrote.

Currently, (m)UC is the only cancer for which EV is approved as standard-of-care, the researchers explain, in their paper.
 

What Were the Differences Between the EV and Non-EV Groups?

Almost all (27 of the 28) patients in the cohort (96%) who had Nectin-4 amplifications had objective responses to EV compared with 24 of 74 (32%) in the group without amplifications (P less than .001). Among the 96% with a response, 82% had partial response and 14% had a complete response.

The amplifications for those treated with EV were linked with longer progression-free survival (90% 12-month PFS vs 41% for those with nonamplified tumors) and longer overall survival (OS).

For those patients treated with EV who had the amplifications, OS was not reached. This was because the researchers could not calculate the OS at 12 months for this group due to more than half of the patients still being alive at that time. That finding contrasts with a median OS of 8.8 months in those patients treated with EV who did not have the amplifications.

EV-treated patients who had Nectin-4 amplifications had a 92% lower risk of death compared with EV-treated patients without the amplifications, according to an analysis that adjusted for factors including age and sex.

“Importantly, in the non–EV-treated patients with (m)UC, Nectin-4 amplifications have no impact on OS [overall survival], suggesting that Nectin-4 amplifications are neither indicating aggressive nor favorable tumor biology, strengthening its potential value as a pure predictive biomarker,” the researchers wrote.
 

 

 

What Are the Implications of These Findings?

“[O]ur study suggests that Nectin-4 amplification is a simple, valuable, and easy-to-implement predictive biomarker for EV in patients with (m)UC. The frequent occurrence of Nectin-4 amplifications in other cancer types suggests that this biomarker is a promising candidate with broader applicability for clinical development of Nectin-4-targeted ADCs in a tumor-agnostic context.”

Choosing the best therapy sequence for (m)UC is crucial, the authors write. Considering Nectin-4 amplifications could inform EV drug development — even at earlier stages of the disease — by defining which patient subgroup has the highest chance for long-term benefit.

The authors acknowledge that the primary limitation of the study is that it is retrospective, using archived primary and metastatic tumor specimens with varying ranges between the time of tumor sampling and start of EV treatment.

“Therefore, our data are hypothesis-generating and prospective confirmation in larger, biomarker-driven trials is mandatory,” the authors wrote.

They note that EV plus pembrolizumab [Keytruda] (EV/P) was recently approved as the new standard of care in first-line treatment for (m)UC, so the predictive value of Nectin-4 amplification in this new treatment setting warrants further research.

Dr. Klümper reported stock and other ownership interests in Bicycle Therapeutics, Pfizer, Daiichi Sankyo/UCB Japan, and Immatics; honoraria from Astellas Pharma and MSD Oncology; consulting or advisory roles with Astellas Pharma, MSD Oncology, and Eisai; and travel reimbursements from Ipsen, Photocure, and MSD Oncology. Other author disclosures are available with the full text of the paper.

Article Source

FROM JOURNAL OF CLINICAL ONCOLOGY


Jumpstart Your AI Learning: The Very Best Resources for Doctors

Like it or not, artificial intelligence (AI) is coming to medicine. For many physicians — maybe you — it’s already here.

More than a third of physicians use AI in their practice. And the vast majority of healthcare companies — 94%, according to Morgan Stanley — use some kind of AI or machine learning.

“It’s incumbent on physicians, as well as physicians in training, to become familiar with at least the basics [of AI],” said internist Matthew DeCamp, MD, PhD, an associate professor in the Center for Bioethics and Humanities at the University of Colorado Anschutz Medical Campus, Aurora, Colorado.

Understanding AI can help you leverage it safely and effectively — plus “make better-informed decisions about whether or not to use it in [your] practice,” Dr. DeCamp said.

“Frankly, the people who are deciding whether to implement algorithms in our day-to-day lives are oftentimes not physicians,” noted Ravi B. Parikh, MD, an assistant professor at the University of Pennsylvania and director of augmented and artificial intelligence at the Penn Center for Cancer Care Innovation, Philadelphia. Yet, physicians are most qualified to assess an AI tool’s usefulness in clinical practice.

That brings us to the best starting place for your AI education: Your own institution. Find out what AI tools your organization is implementing — and how you can influence them.

“Getting involved with our hospital data governance is the best way not only to learn practically what these AI tools do but also to influence the development process in positive ways,” Dr. Parikh said.

From there, consider the following resources to enhance your AI knowledge.
 

Get a Lay of the Land: Free Primers

Many clinical societies and interest groups have put out AI primers, an easy way to get a broad overview of the technology. The following were recommended or developed by the experts we spoke to, and all are free:

  • The American Medical Association’s (AMA’s) framework for advancing healthcare AI lays out actionable guidance. Ask three key questions, the AMA recommends: Does it work? Does it work for my patients? Does it improve health outcomes?
  • The Coalition for Health AI’s Blueprint for Trustworthy AI Implementation Guidance and Assurance for Healthcare provides a high-level summary of how to evaluate AI in healthcare, plus steps for implementing it. AI systems should be useful, safe, accountable, explainable, fair, and secure, the report asserted.
  • The National Academy of Medicine’s draft code of conduct for AI in healthcare proposes core principles and commitments. These “reflect simple guideposts to guide and gauge behavior in a complex system and provide a starting point for real-time decision-making,” the report said.
  • Health AI Partnership — a collaboration of Duke Health and Microsoft — outlines eight key decision points to consider at any stage of AI implementation, whether you’re still planning how to use it or you’ve started but want to improve it. The site also provides a breakdown of standards by regulatory agencies, organizations, and oversight bodies — so you can make sure your practices align with their guidance.
 

 

Make the Most of Conferences

Next time you’re at a conference, check the agenda for sessions on AI. “For someone who’s interested in this, I would be looking for content in my next national meeting because, undoubtedly, it’s going to be there,” said Dr. DeCamp. In a fast-moving field like AI, it’s a great way to get fresh, up-to-the-moment insights.

Listen to This Podcast

The New England Journal of Medicine’s free monthly podcast AI Grand Rounds is made for researchers and clinicians. Available on Apple, Spotify, and YouTube, the pod is good for “someone who’s looking to see both where the field is going [and to hear] a retrospective on big-name papers,” said Dr. Parikh. Episodes run for about an hour.

To learn about the challenges of applying AI to biology, listen to the episode with Daphne Koller, PhD, founder of the AI-driven drug discovery and development company insitro. For insights on the potential of AI in medicine, tune into the episode with Eric Horvitz, MD, PhD, Microsoft’s chief scientific officer.
 

Consider a Class

Look for courses that focus on AI applications in clinical practice rather than a deep dive into theory. (You need to understand how these tools will influence your work, not the intricacies of large language model development.) Be wary of corporate-funded training that centers on one product, which could present conflicts of interest, said Dr. DeCamp. See the chart for courses that meet these criteria.

A version of this article appeared on Medscape.com.


High Alcohol Intake in MASLD Increases Risk of Cirrhosis

Critical Need for Early Tx of High-Risk Alcohol Use

Nearly half of patients with steatotic liver disease reported concurrent alcohol use, and more than 11% reported high-risk consumption, a national study of more than a million US veterans found.

Moreover, the combination of steatotic liver disease and high-risk alcohol intake carried a more than 43% higher long-term risk of liver cirrhosis compared with no alcohol use, according to researchers led by Robert J. Wong, MD, MS, of the Division of Gastroenterology and Hepatology, Veterans Affairs Healthcare System Palo Alto, at Stanford University School of Medicine in Palo Alto, California.

However, the study found that “reducing alcohol use lowers risk of cirrhosis, emphasizing the importance of timely alcohol use assessment and early interventions to address high-risk alcohol use in steatotic liver disease,” Dr. Wong and associates wrote in Gastroenterology.

Although concurrent moderate to heavy alcohol intake would be expected to hasten liver disease progression, the existing literature has been conflicting, the authors noted. Several studies have even found moderate alcohol use to be associated with a lower risk of advanced liver disease among MASLD patients, including that by Dunn et al.
 

The Study

MASLD patients were identified through the US Veterans Affairs Corporate Data Warehouse from January 1, 2010, through December 31, 2017, with follow-up through December 31, 2022.

Alcohol use was assessed by Alcohol Use Disorders Identification Test–Concise (AUDIT-C) scores and was categorized as follows: no alcohol (AUDIT-C = 0), low-risk alcohol use (AUDIT-C 1-2 for women and 1-3 for men), and high-risk alcohol use (AUDIT-C ≥ 3 for women and ≥ 4 for men).
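The categorization above is a simple sex-specific threshold rule. As a minimal illustrative sketch of how the study's cutoffs map scores to categories (the function name and the sex encoding are our assumptions, not taken from the paper):

```python
def categorize_audit_c(score: int, sex: str) -> str:
    """Map an AUDIT-C score (0-12) to the study's drinking categories.

    Thresholds from the study: no alcohol = 0;
    low-risk = 1-2 (women) or 1-3 (men);
    high-risk = >= 3 (women) or >= 4 (men).
    """
    if score == 0:
        return "no alcohol"
    # The high-risk cutoff is one point lower for women than for men.
    high_cutoff = 3 if sex == "female" else 4
    return "high-risk" if score >= high_cutoff else "low-risk"
```

For example, a score of 3 falls in the low-risk category for a man but in the high-risk category for a woman.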

Among the 1,156,189 veterans with MASLD, 54.2% reported no alcohol use, 34.6% low-risk use, and 11.2% high-risk use. Over a median follow-up of 9-10 years, incidence rates of cirrhosis were 0.53 per 100 person-years for no use, 0.42 for low-risk use, and 0.76 for high-risk use.

In contrast to patients with baseline high-risk alcohol intake who reported no change in use, those who decreased their alcohol intake during follow-up experienced a 39% reduction in the long-term risk of cirrhosis, for a hazard ratio of 0.61 (95% CI, 0.45-0.83; P < .01).



About 70% of patients were non-Hispanic Whites and more than 90% were male in all consumption categories. The no-alcohol group was older than the high-risk alcohol group: 64 years vs 59.9 years (P < .0001). Compared with the high-risk alcohol group, the no-alcohol group had a significantly greater proportion of comorbid diabetes (62.3% vs 42.5%), hypertension (77.9% vs 69.1%), or cardiovascular disease (40.2% vs 25.9%, P < .0001 for all comparisons).

Notably, fewer than 5% of patients with high-risk use received behavioral or pharmacologic therapy, and of those who did, most were referred for or received treatment at or near the time of cirrhosis diagnosis. “This highlights a major gap in linking patients with high-risk alcohol use to appropriate behavioral or pharmacologic therapy in a timely manner and may reflect missed opportunities to prevent further alcohol-related morbidity and mortality,” Dr. Wong and colleagues wrote.

They called for studies of novel interventions for timely assessment of alcohol use with linkage to addiction services. They cited the need to understand the interaction between levels of alcohol use and underlying MASLD, adding, “More research is also needed to understand whether this interaction varies across different populations.”

This study received no specific funding. Dr. Wong reported funding through his institution from Gilead Sciences, Exact Sciences, and Thera Technologies.

 

Recent consensus in defining metabolic dysfunction–associated steatotic liver disease (MASLD) has raised awareness of the combined impact of cardiometabolic risk factors and alcohol consumption on liver disease progression. This study by Wong et al. highlights the undeniable influence of high-risk alcohol use on the development of advanced liver disease.

In a national cohort of over 1 million US veterans with steatotic liver disease (SLD), patients with high-risk alcohol use based on AUDIT-C assessment exhibited > 43% greater risk of cirrhosis compared to those with no alcohol use. The relationship between alcohol and liver severity in SLD was observed even after excluding patients meeting classification for predominant alcohol-associated liver disease. While increased alcohol use was associated with increased incidence of cirrhosis, decreased alcohol use led to a notable 39% reduction in cirrhosis risk over time.

Reducing alcohol consumption remains a cornerstone of best-practice guidelines for mitigating risk of progression in steatotic liver disease. However, the results of this study emphasize the critical need for early identification and treatment of high-risk alcohol use in all patients with SLD. While a universal recommendation of alcohol abstinence provides pragmatic implementation, there is a significant need to better understand the interaction of specific metabolic risk factors and patterns of alcohol use across the spectrum of MetALD to guide personalized recommendations for patient education and management.



Further research using prospective clinical trial design is needed to evaluate the interplay of alcohol consumption and metabolic risk factors across variable age, sex, genetics, and environmental exposures that are increasingly being recognized as vital drivers of health and disease.

Tiffany Wu, MD, MS, is a fellow in Transplant Hepatology at Mayo Clinic in Rochester, Minnesota. She has no conflicts.

Publications
Topics
Sections
Body

 

Recent consensus in defining metabolic dysfunction-associated steatotic liver disease (MASLD) has raised awareness for the combined impact of cardiometabolic risk factors and alcohol consumption on liver disease progression. This study by Wong et al. highlights the undeniable influence of high-risk alcohol use on the development of advanced liver disease.

In a national cohort of over 1 million US veterans with steatotic liver disease (SLD), patients with high-risk alcohol use based on AUDIT-C assessment exhibited > 43% greater risk of cirrhosis compared to those with no alcohol use. The relationship between alcohol and liver severity in SLD was observed even after excluding patients meeting classification for predominant alcohol-associated liver disease. While increased alcohol use was associated with increased incidence of cirrhosis, decreased alcohol use led to a notable 39% reduction in cirrhosis risk over time.

Reducing alcohol consumption remains best practice guidelines for mitigating risk of progression in steatotic liver disease. However, results of this study emphasize the critical need for early identification and treatment of high-risk alcohol use in all patients with SLD. While universal recommendations for alcohol abstinence provides pragmatic implementation, there is a significant need to better understand the interaction of specific metabolic risk factors and patterns of alcohol use across the spectrum of MetALD to guide personalized recommendations for patient education and management.

Mayo Clinic
Dr. Tiffany Wu


Further research using prospective clinical trial design is needed to evaluate the interplay of alcohol consumption and metabolic risk factors across variable age, sex, genetics, and environmental exposures that are increasingly being recognized as vital drivers of health and disease.

Tiffany Wu, MD, MS, is a fellow in Transplant Hepatology at Mayo Clinic in Rochester, Minnesota. She has no conflicts.

Body

 

Recent consensus in defining metabolic dysfunction-associated steatotic liver disease (MASLD) has raised awareness for the combined impact of cardiometabolic risk factors and alcohol consumption on liver disease progression. This study by Wong et al. highlights the undeniable influence of high-risk alcohol use on the development of advanced liver disease.

In a national cohort of over 1 million US veterans with steatotic liver disease (SLD), patients with high-risk alcohol use based on AUDIT-C assessment exhibited > 43% greater risk of cirrhosis compared to those with no alcohol use. The relationship between alcohol and liver severity in SLD was observed even after excluding patients meeting classification for predominant alcohol-associated liver disease. While increased alcohol use was associated with increased incidence of cirrhosis, decreased alcohol use led to a notable 39% reduction in cirrhosis risk over time.

Reducing alcohol consumption remains best practice guidelines for mitigating risk of progression in steatotic liver disease. However, results of this study emphasize the critical need for early identification and treatment of high-risk alcohol use in all patients with SLD. While universal recommendations for alcohol abstinence provides pragmatic implementation, there is a significant need to better understand the interaction of specific metabolic risk factors and patterns of alcohol use across the spectrum of MetALD to guide personalized recommendations for patient education and management.

Mayo Clinic
Dr. Tiffany Wu


Further research using prospective clinical trial design is needed to evaluate the interplay of alcohol consumption and metabolic risk factors across variable age, sex, genetics, and environmental exposures that are increasingly being recognized as vital drivers of health and disease.

Tiffany Wu, MD, MS, is a fellow in Transplant Hepatology at Mayo Clinic in Rochester, Minnesota. She has no conflicts.

Title
Critical Need for Early Tx of High-Risk Alcohol Use
Critical Need for Early Tx of High-Risk Alcohol Use

One in nine patients with steatotic liver disease reported concurrent alcohol use, and more than 11% reported high-risk consumption, a national study of more than a million US veterans found.

Moreover, the combination of steatotic liver disease and high-risk alcohol intake carried a more than 43% higher long-term risk of liver cirrhosis compared with no alcohol use, according to researchers led by Robert J. Wong, MD, MS, of the Division of Gastroenterology and Hepatology, Veterans Affairs Healthcare System Palo Alto, at Stanford University School of Medicine in Palo Alto, California.

However, the study found that “reducing alcohol use lowers risk of cirrhosis, emphasizing the importance of timely alcohol use assessment and early interventions to address high-risk alcohol use in steatotic liver disease,” Dr. Wong and associates wrote in Gastroenterology.

Although concurrent moderate to heavy alcohol intake would be expected to lead more rapidly to liver disease progression, the existing literature has been conflicting, the authors noted. Several studies have even found moderate alcohol associated with a lower risk of advanced liver disease among MASLD patients, including that by Dunn et al. .
 

The Study

MASLD patients were identified through the US Veterans Affairs Corporate Data Warehouse from January 1, 2010, through December 31, 2017, with follow-up through December 31, 2022.

Alcohol use was assessed by Alcohol Use Disorders Identification Test–Concise (AUDIT-C) scores and was categorized as follows: no alcohol (AUDIT-C = 0), low-risk alcohol use (AUDIT-C 1-2 for women and 1–3 for men), and high-risk alcohol (AUDIT-C ≥ 3 for women and ≥ 4 for men).

Among the 1,156,189 veterans with MASLD, 54.2% reported no alcohol, 34.6% low-risk alcohol, and 11.2% high-risk alcohol use. In median follow-up of nine to 10 years, incidence rates of cirrhosis were .53 per 100 person-years for no use, .42 for low-risk use, and .76 for high-risk use.

In contrast to patients with baseline high-risk alcohol intake who reported no change in use, those who decreased their alcohol intake during follow-up experienced a 39% reduction in the long-term risk of cirrhosis, for a hazard ratio of .61 (95% CI, .45-.83, P < .01).

Dr. Wong
Dr. Robert J. Wong


About 70% of patients were non-Hispanic Whites and more than 90% were male in all consumption categories. The no-alcohol group was older than the high-risk alcohol group: 64 years vs 59.9 years (P < .0001). Compared with the high-risk alcohol group, the no-alcohol group had a significantly greater proportion of comorbid diabetes (62.3% vs 42.5%), hypertension (77.9% vs 69.1%), or cardiovascular disease (40.2% vs 25.9%, P < .0001 for all comparisons).

In a significant study observation, fewer than 5% of patients with high-risk use received behavioral or pharmacologic therapy and of those who did, most were referred for or received treatment at or near the time of cirrhosis diagnosis. “This highlights a major gap in linking patients with high-risk alcohol use to appropriate behavioral or pharmacologic therapy in a timely manner and may reflect missed opportunities to prevent further alcohol-related morbidity and mortality,” Dr. Wong and colleagues wrote.

They called for studies of novel interventions for timely assessment of alcohol use with linkage to addiction services. They cited the need to understand the interaction between levels of alcohol use and underlying MASLD, adding, “More research is also needed to understand whether this interaction varies across different populations.”

This study received no specific funding. Dr. Wong reported funding through his institution from Gilead Sciences, Exact Sciences, and Thera Technologies.

One in nine patients with steatotic liver disease reported concurrent alcohol use, and more than 11% reported high-risk consumption, a national study of more than a million US veterans found.

Moreover, the combination of steatotic liver disease and high-risk alcohol intake carried a more than 43% higher long-term risk of liver cirrhosis compared with no alcohol use, according to researchers led by Robert J. Wong, MD, MS, of the Division of Gastroenterology and Hepatology, Veterans Affairs Healthcare System Palo Alto, at Stanford University School of Medicine in Palo Alto, California.

However, the study found that “reducing alcohol use lowers risk of cirrhosis, emphasizing the importance of timely alcohol use assessment and early interventions to address high-risk alcohol use in steatotic liver disease,” Dr. Wong and associates wrote in Gastroenterology.

Although concurrent moderate to heavy alcohol intake would be expected to lead more rapidly to liver disease progression, the existing literature has been conflicting, the authors noted. Several studies have even found moderate alcohol associated with a lower risk of advanced liver disease among MASLD patients, including that by Dunn et al. .
 

The Study

MASLD patients were identified through the US Veterans Affairs Corporate Data Warehouse from January 1, 2010, through December 31, 2017, with follow-up through December 31, 2022.

Alcohol use was assessed by Alcohol Use Disorders Identification Test–Concise (AUDIT-C) scores and was categorized as follows: no alcohol (AUDIT-C = 0), low-risk alcohol use (AUDIT-C 1-2 for women and 1–3 for men), and high-risk alcohol (AUDIT-C ≥ 3 for women and ≥ 4 for men).

Among the 1,156,189 veterans with MASLD, 54.2% reported no alcohol, 34.6% low-risk alcohol, and 11.2% high-risk alcohol use. In median follow-up of nine to 10 years, incidence rates of cirrhosis were .53 per 100 person-years for no use, .42 for low-risk use, and .76 for high-risk use.

In contrast to patients with baseline high-risk alcohol intake who reported no change in use, those who decreased their alcohol intake during follow-up experienced a 39% reduction in the long-term risk of cirrhosis, for a hazard ratio of .61 (95% CI, .45-.83, P < .01).

Dr. Wong
Dr. Robert J. Wong


About 70% of patients were non-Hispanic Whites and more than 90% were male in all consumption categories. The no-alcohol group was older than the high-risk alcohol group: 64 years vs 59.9 years (P < .0001). Compared with the high-risk alcohol group, the no-alcohol group had a significantly greater proportion of comorbid diabetes (62.3% vs 42.5%), hypertension (77.9% vs 69.1%), or cardiovascular disease (40.2% vs 25.9%, P < .0001 for all comparisons).

In a significant study observation, fewer than 5% of patients with high-risk use received behavioral or pharmacologic therapy and of those who did, most were referred for or received treatment at or near the time of cirrhosis diagnosis. “This highlights a major gap in linking patients with high-risk alcohol use to appropriate behavioral or pharmacologic therapy in a timely manner and may reflect missed opportunities to prevent further alcohol-related morbidity and mortality,” Dr. Wong and colleagues wrote.

They called for studies of novel interventions for timely assessment of alcohol use with linkage to addiction services. They cited the need to understand the interaction between levels of alcohol use and underlying MASLD, adding, “More research is also needed to understand whether this interaction varies across different populations.”

This study received no specific funding. Dr. Wong reported funding through his institution from Gilead Sciences, Exact Sciences, and Thera Technologies.
Article Source

FROM GASTROENTEROLOGY


A Single Meatless Meal Can Benefit Patients With Cirrhosis

Article Type
Changed

Replacing meat with plant-based proteins for just one meal reduced ammonia levels in patients with cirrhosis, a proof-of-concept study showed.

High levels of serum ammonia after protein loads may predict poor outcomes, including hepatic encephalopathy (HE), in patients with cirrhosis, whereas vegetable protein diets are associated with decreased serum ammonia, according to Jasmohan Bajaj, MD, a gastroenterologist at Virginia Commonwealth University School of Medicine and the Richmond VA Medical Center, Richmond, Virginia, and colleagues.

However, changing from meat-based to non–meat-based meals is difficult to do over a long period.

“Previous studies have changed people’s diets completely, expecting them to be on a meatless or vegetarian diet with a similar amount of protein when they’ve been eating meat their entire life,” Dr. Bajaj told this news organization. “That’s not really sustainable in the long run.”

“Our hope is that occasional meal substitutions would be beneficial,” he said. “This study is a first step toward seeing if that works.”

The study was published online on May 2 in Clinical and Translational Gastroenterology.

Meal Type Affects Ammonia Levels Differently

The researchers randomized 30 men with cirrhosis who were eating a traditional Western meat-based diet into three groups that received a pork/beef burger, a vegetarian bean burger, or a burger made with a vegan meat substitute. The burgers provided 20 g of protein each, and all meals included low-fat potato chips, a whole-grain bun, and water, with no condiments.

The participants’ median age was 66 years in the meat and vegetarian arms and 71 years in the vegan arm. About half had diabetes and half had prior HE, with both conditions evenly distributed across the treatment arms. Cirrhosis etiologies included hepatitis C virus infection, alcohol, and metabolic dysfunction-associated steatohepatitis.

Stool microbiome characteristics, changes in ammonia, and metabolomics were compared between and within groups.

In the 3 days prior to the intervention, participants had similar intakes of red meat, poultry, fish, eggs, bread, cheese, rice, fruits, vegetables, yogurt, coffee, tea, and carbonated caffeinated and decaffeinated beverages.

Blood for metabolomics and ammonia was drawn at baseline and hourly for 3 hours post-meal while patients were under observation. All participants completed the entire meal, as shown subsequently by markers of food consumption, and none developed HE symptoms during the observation period.

The composition of the stool microbiome was similar at baseline across groups and remained unchanged. However, serum ammonia increased from baseline in the meat group but not in the vegetarian or vegan groups. The serum microbiome was not analyzed because of the low yield.

Serum metabolomics showed beneficial changes over time associated with branched-chain amino acid metabolism and urea cycle, phospholipid, and acylcarnitine levels in the vegetarian and vegan meal groups compared with the meat-based group.

In contrast, alterations in lipid profiles (higher sphingomyelins and lower lysophospholipids) were seen in the meat group.

The study was limited by its relatively small sample size, focus on the impact of only one meal, and lack of clinical outcomes, sarcopenia assessment, cognitive testing, or urine collection.

“Intermittent meat substitution with vegan or vegetarian alternatives could be helpful in reducing ammonia generation in cirrhosis,” the authors concluded.

The next step “is to substitute one meal two or three times a week, so we can move forward with this analysis and eventually be able to show that the liver is in better shape,” Dr. Bajaj said.

Meanwhile, clinicians should encourage patients with liver disease who eat meat regularly to try to substitute it with protein from plant or dairy sources, at least occasionally, he said. When doing so, “clinicians should ask their patients’ preferences before assuming that they will do everything that you ask them to do because nutrition in cirrhosis is really critical — not only what they eat but also when they eat. Working with a dietitian, like we did in our study, is critical, or at least having access to one if you don’t have one in your practice.”

 

 

Positive Results From a Simple Change

Commenting on the study, Nancy S. Reau, MD, section chief, hepatology and associate director of organ transplantation at Rush Medical College in Chicago, said, “My biggest concern is making sure patients are ingesting enough quality protein and calories because anorexia is a common complication in cirrhosis, and sarcopenia is associated with poor outcomes.

“You don’t want to suggest a change that will result in eating less or skipping a meal,” she said. So, “it is encouraging to see that suggesting a small change, just one meal a day, that may not impact calorie intake could have positive results.”

Dr. Reau added that “it is great to see evidence that this small change also could be a way of decreasing the risk of HE while not compromising on patient nutrition.”

Larger studies with outcome data showing that this approach could prevent readmission in patients hospitalized for HE would be helpful, she said.

The study was partly supported by the ACG Clinical Research Award; VA Merit Review grants 2I01CX001076 and I01CX002472; and NIAAA grant R01AA29398. Dr. Bajaj and Dr. Reau reported no conflicts of interest.

A version of this article appeared on Medscape.com.


Migraine Treatment Outcomes

Article Type
Changed
Sex-Based Therapy and Prednisolone for Medication Overuse Headache

Alan Rapoport, MD

Outcomes of Acute and Preventive Migraine Therapy Based on Patient Sex

I previously have addressed myths about migraine as they pertain to men and women. When I found an interesting study recently published in Cephalalgia investigating the effectiveness of calcitonin gene-related peptide receptor (CGRP-R) antagonists (gepants) for acute care and prevention of episodic migraine and CGRP monoclonal antibodies for preventive treatment of episodic and chronic migraine in men and women, I thought I would discuss it here.

The study’s aim was to discern if patient sex contributed to outcomes in each specific treatment arm. Female sex hormones have been recognized as factors in promoting migraine, and women show increased severity, persistence, and comorbidity in migraine profiles, and increased prevalence of migraine relative to men.

Gepants used for acute therapy (ubrogepant, rimegepant, zavegepant) and preventive therapy (atogepant, rimegepant) were studied in this trial. Erenumab, fremanezumab, galcanezumab, and eptinezumab are monoclonal antibodies that either sit on the CGRP receptor (erenumab) or inactivate the CGRP ligand (fremanezumab, galcanezumab, and eptinezumab) and are used for migraine prevention. CGRP-based therapies are not effective in all patients and understanding which patient groups respond preferentially could reduce trial and error in treatment selection. The effectiveness of treatments targeting CGRP or the CGRP receptor may not be uniform in men and women, highlighting the need for further research and understanding of CGRP neurobiology in both sexes.

Key findings:

  • In the trial by Porreca et al, the 3 gepants approved by the FDA for the acute care of migraine (ubrogepant, rimegepant, zavegepant) produced a statistically significant drug effect in women for the 2-hour pain freedom (2h-PF) endpoint, with an average drug effect of 9.5% (CI, 7.4-11.6) and an average number needed to treat (NNT) of 11.
  • Men did not show statistically significant effects with the acute use of gepants. The average drug effect was 2.8%, and the average NNT was 36.
  • For both men and women, CGRP-targeting therapies for prevention of migraine (the 4 monoclonal antibodies) were equally effective; however, possible sex differences remain uncertain and need further study.
  • In patients with chronic migraine, CGRP/CGRP-R antibodies were similarly effective in both men and women.
  • For the 2-hour freedom from most bothersome symptom (2h-MBS) endpoint when gepants were given acutely, the effects were much better in women than men, with an average drug effect of 10.2% and an average NNT of 10.
  • In men, these medications produced observed treatment effects on 2h-MBS with an average drug effect of 3.2% and an average NNT of 32.
  • In men, 5 out of 12 estimates favored placebo over the active treatment, suggesting a treatment with little to no effect.
  • The pooled treatment effects for women were 3 times as large, at 9.2% and 10.2%, respectively.
  • The placebo response rates for 2 of the 3 ubrogepant studies and one of 2 zavegepant studies were higher in men than in women.
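The NNT values above follow directly from the reported absolute drug effects: NNT is the reciprocal of the absolute risk difference, rounded up to the next whole patient. A quick arithmetic check (figures from the trial; the helper function itself is illustrative):

```python
import math

def nnt(absolute_drug_effect: float) -> int:
    # Number needed to treat = 1 / absolute risk difference, rounded up
    return math.ceil(1 / absolute_drug_effect)

print(nnt(0.095))  # women, 2h-PF  -> 11
print(nnt(0.028))  # men, 2h-PF    -> 36
print(nnt(0.102))  # women, 2h-MBS -> 10
print(nnt(0.032))  # men, 2h-MBS   -> 32
```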

The study concludes that, while small-molecule CGRP-R antagonists are dramatically effective for acute therapy of migraine in women, available data do not demonstrate effectiveness in men. For prevention of episodic and chronic migraine, the treatment effect always numerically favored active treatment in both men and women. The data highlight possible differential effects of CGRP-targeted therapies in different patient populations and the need for a better understanding of CGRP neurobiology in men and women. The study also emphasizes the need to learn which patient groups preferentially respond to CGRP-based therapies in order to reduce trial and error in treatment selection. Note that rimegepant data on prevention were not available for analysis at the time of writing.

It would be interesting to perform a meta-analysis of multiple well-done, large, real-world studies to see if the same differences and similarities are found in men versus women for acute care of migraine and prevention of episodic and chronic migraine. I suspect that we would find that acute care results favor women but that some men do well.

 

The Effectiveness of Prednisolone for Treating Medication Overuse Headache

I often discuss medication overuse headache (MOH), as it is difficult to diagnose and treat, so I wanted to comment on another pertinent study. It is a post hoc analysis of the Registry for Load and Management of Medication Overuse Headache (RELEASE). The RELEASE trial is an ongoing, multicenter, observational, cohort study of MOH that has been conducted in Korea since April 2020. Findings were recently published in Headache by Lee et al.

 

MOH is a secondary headache disorder that develops in patients with a preexisting primary headache when they overuse acute care headache medications of any type except gepants. This includes prescription medications such as triptans, ergots, and butalbital-containing medications, as well as opioids, aspirin, acetaminophen, combination medications (often containing caffeine), or a mix of these. This condition significantly impacts patients’ quality of life and productivity, usually increasing the number of headache days per month and leading to higher healthcare-related costs.

 

Treating MOH is challenging because of the lack of high-quality drug trials specifically designed for MOH and because of clinicians' inexperience with the condition. Current evidence is based largely on subgroup analyses of drug trials for the treatment of chronic migraine that included these patients.

Withdrawal of acute care headache medications that are being overused has traditionally been considered an important aspect of MOH treatment, although this may be changing. Withdrawal symptoms, such as increased intensity of headache pain, frequency of headaches, and other symptoms like agitation and sleep disturbance, can prevent patients from discontinuing overused medications. Systemic corticosteroids are widely used to reduce these withdrawal headaches, but clinical trials are sparse and have failed to meet proper endpoints. Despite this, corticosteroids have shown potential benefits, such as decreasing withdrawal headaches, reducing the use of rescue medications, and lowering headache intensity at certain time points after treatment.

Given these findings, the study authors hypothesized that prednisolone may play a role in converting MOH to non-MOH at 3 months after treatment. The objective was to evaluate the outcome of prednisolone therapy in reversing medication overuse at 3 months posttreatment in patients with MOH, using prospective multicenter registry data. Prednisolone was prescribed to 59 of the 309 patients (19.1%) enrolled during the observational study period, at doses ranging from 10 to 40 mg/day for 5-14 days. Of the 309 enrolled patients, 228 (73.8%) completed the 3-month follow-up period.

Key findings:

  • The MOH reversal rates at 3 months postbaseline were 76% (31/41) in the prednisolone group and 57.8% (108/187) in the no-prednisolone group (P = .034).
  • The steroid effect remained significant (adjusted odds ratio, 2.78; 95% CI, 1.27-6.1; P = .010) after adjustment for the number of monthly headache days at baseline, mode of discontinuation of overused medication, use of early preventive medications, and the number of combined preventive medications.
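For context, the crude (unadjusted) odds ratio can be recomputed from the reported counts; the adjusted odds ratio of 2.78 additionally controls for the covariates listed above. A quick check (counts are from the study; the crude OR itself is derived here, not reported):

```python
# Reversal at 3 months: 31 of 41 (prednisolone) vs 108 of 187 (no prednisolone)
rev_p, n_p = 31, 41
rev_np, n_np = 108, 187

rate_p = rev_p / n_p      # ~0.76  -> the reported 76%
rate_np = rev_np / n_np   # ~0.578 -> the reported 57.8%

# Crude odds ratio from the 2x2 table: (a*d) / (b*c)
crude_or = (rev_p * (n_np - rev_np)) / ((n_p - rev_p) * rev_np)
print(round(crude_or, 2))  # ~2.27, before covariate adjustment
```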

The study had several strengths, including the multicenter collection of data, prospective follow-ups, and comprehensiveness of data acquisition. However, it also had significant limitations, such as the noninterventional, observational nature of the study, potential bias in steroid prescription (every doctor prescribed what they wanted), and heterogeneity in the patient population. Also, there were a variety of treatments, and they were not standardized. Further external validation may be necessary before generalizing the study results.

Despite these limitations, the results do suggest that prednisolone may be one part of a valid treatment option for patients with MOH. I suspect, if the proper studies are done, we will see that using a good preventive medication, with few adverse events, and with careful education of the patient, formal detoxification will not be necessary when treating many patients with MOH. This has been my experience with MOH treatment utilizing the newer anti-CGRP preventive medications, including the older monoclonal antibodies and the newer gepants.

 

 

Author and Disclosure Information

Dr. Alan Rapoport, Professor, Department of Neurology, University of California at Los Angeles, Los Angeles, California; Voluntary Faculty, Department of Neurology, Alan M. Rapoport, MD, Professional Corporation, Beverly Hills, California
Alan M. Rapoport, MD, has disclosed the following relevant financial relationships:
Serve(d) as an advisor for: AbbVie; Biohaven; Cala Health; Pfizer; Teva Pharmaceutical Industries; Theranica; Xoc
Serve(d) as a speaker or a member of a speakers bureau for: AbbVie; Amgen; Biohaven; Pfizer; Impel; Lundbeck; Teva Pharmaceutical Industries
Editor-in-Chief of Neurology Reviews


Given these findings, this published study hypothesized that prednisolone may play a role in converting MOH to non-MOH at 3 months after treatment. The objective was to evaluate the outcome of prednisolone therapy in reversing medication overuse at 3 months posttreatment in patients with MOH using prospective multicenter registry data. Prednisolone was prescribed to 59 out of 309 patients (19.1%) enrolled during this observational study period, with doses ranging from 10 to 40 mg/day for 5-14 days. Of these patients, 228 (73.8%) completed the 3-month follow-up period.

Key findings:

  • The MOH reversal rates at 3 months postbaseline were 76% (31/41) in the prednisolone group and 57.8% (108/187) in the no prednisolone group (p = 0.034).
  • The steroid effect remained significant (adjusted odds ratio, 2.78; 95% confidence interval 1.27-6.1, p = 0.010) after adjusting for the number of monthly headache days at baseline, mode of discontinuation of overused medication, use of early preventive medications, and the number of combined preventive medications.

The study had several strengths, including the multicenter collection of data, prospective follow-ups, and comprehensiveness of data acquisition. However, it also had significant limitations, such as the noninterventional, observational nature of the study, potential bias in steroid prescription (every doctor prescribed what they wanted), and heterogeneity in the patient population. Also, there were a variety of treatments, and they were not standardized. Further external validation may be necessary before generalizing the study results.

Despite these limitations, the results do suggest that prednisolone may be one part of a valid treatment option for patients with MOH. I suspect, if the proper studies are done, we will see that using a good preventive medication, with few adverse events, and with careful education of the patient, formal detoxification will not be necessary when treating many patients with MOH. This has been my experience with MOH treatment utilizing the newer anti-CGRP preventive medications, including the older monoclonal antibodies and the newer gepants.

 

 

Alan Rapoport, MD

Outcomes of Acute and Preventive Migraine Therapy Based on Patient Sex

I have previously addressed myths about migraine as they pertain to men and women. Recently, Cephalalgia published an interesting study investigating the effectiveness of calcitonin gene-related peptide receptor (CGRP-R) antagonists (gepants) for acute care and prevention of episodic migraine, and of CGRP monoclonal antibodies for preventive treatment of episodic and chronic migraine, in men and women, so I thought I would discuss it here.

The study’s aim was to discern whether patient sex contributed to outcomes in each specific treatment arm. Female sex hormones have been recognized as factors that promote migraine, and compared with men, women show greater severity, persistence, and comorbidity in their migraine profiles, as well as a higher prevalence of migraine.

Gepants used for acute therapy (ubrogepant, rimegepant, zavegepant) and preventive therapy (atogepant, rimegepant) were studied in this trial. Erenumab, fremanezumab, galcanezumab, and eptinezumab are monoclonal antibodies used for migraine prevention; erenumab binds the CGRP receptor, whereas fremanezumab, galcanezumab, and eptinezumab inactivate the CGRP ligand. CGRP-based therapies are not effective in all patients, and understanding which patient groups respond preferentially could reduce trial and error in treatment selection. The effectiveness of treatments targeting CGRP or its receptor may not be uniform in men and women, highlighting the need for further research into CGRP neurobiology in both sexes.

Key findings:

  • In the trial by Porreca et al, the 3 gepants approved by the FDA for the acute care of migraine (ubrogepant, rimegepant, zavegepant) produced a statistically significant drug effect in women on the 2-hour pain freedom (2h-PF) endpoint, with an average drug effect of 9.5% (CI, 7.4-11.6) and an average number needed to treat (NNT) of 11.
  • Men did not show statistically significant effects with the acute use of gepants. The average drug effect was 2.8%, and the average NNT was 36.
  • For both men and women, CGRP-targeting therapies for prevention of migraine (the 4 monoclonal antibodies) were equally effective; however, possible sex differences remain uncertain and need further study.
  • In patients with chronic migraine, CGRP/CGRP-R antibodies were similarly effective in both men and women.
  • For the 2-hour freedom from most bothersome symptom (2h-MBS) endpoint when gepants were given acutely, the effects were much better in women than men, with an average drug effect of 10.2% and an average NNT of 10.
  • In men, these medications produced observed treatment effects on 2h-MBS with an average drug effect of 3.2% and an average NNT of 32.
  • In men, 5 out of 12 estimates favored placebo over the active treatment, suggesting a treatment with little to no effect.
  • The pooled treatment effects for women were roughly 3 times as large as those for men, at 9.2% (2h-PF) and 10.2% (2h-MBS), respectively.
  • The placebo response rates for 2 of the 3 ubrogepant studies and one of 2 zavegepant studies were higher in men than in women.
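The NNT values in these bullets follow directly from the reported drug effects: NNT is the reciprocal of the absolute difference in responder rates between active drug and placebo, conventionally rounded up to the next whole patient. A minimal Python sketch, using the average drug effects quoted above, reproduces the reported figures:

```python
import math

def nnt(drug_effect):
    """NNT = 1 / absolute risk difference (active minus placebo responder rate),
    conventionally rounded up to the next whole number."""
    return math.ceil(1 / drug_effect)

# Average drug effects reported in the study
effects = {
    "women, 2h-PF": 0.095,   # reported NNT 11
    "men, 2h-PF": 0.028,     # reported NNT 36
    "women, 2h-MBS": 0.102,  # reported NNT 10
    "men, 2h-MBS": 0.032,    # reported NNT 32
}

for group, effect in effects.items():
    print(f"{group}: NNT = {nnt(effect)}")
```

Each computed value matches the NNT reported in the corresponding bullet, which is simply a consequence of how NNT is defined rather than an independent finding.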

The study concludes that, while small-molecule CGRP-R antagonists are dramatically effective for the acute therapy of migraine in women, the available data do not demonstrate effectiveness in men. For prevention of episodic and chronic migraine, the treatment effect always numerically favored active treatment in both men and women. The data highlight possible differential effects of CGRP-targeted therapies in different patient populations and the need for a better understanding of CGRP neurobiology in men and women. The study also emphasizes the need to understand which patient groups preferentially respond to CGRP-based therapies in order to reduce trial and error in treatment. Note that rimegepant data on prevention were not available for analysis at the time of writing.

It would be interesting to perform a meta-analysis of multiple well-done, large, real-world studies to see if the same differences and similarities are found in men versus women for acute care of migraine and prevention of episodic and chronic migraine. I suspect that we would find that acute care results favor women but that some men do well.

 

The Effectiveness of Prednisolone for Treating Medication Overuse Headache

I often discuss medication overuse headache (MOH) because it is difficult to diagnose and treat, so I wanted to comment on another pertinent study: a post hoc analysis of the Registry for Load and Management of Medication Overuse Headache (RELEASE). RELEASE is an ongoing multicenter, observational cohort study of MOH that has been conducted in Korea since April 2020; the findings were recently published in Headache by Lee et al.

 

MOH is a secondary headache disorder that develops in patients with a preexisting primary headache when they overuse acute care headache medications of any type except gepants. This includes prescription medications such as triptans, ergots, and butalbital-containing products; opioids; aspirin; acetaminophen; combination medications, which often contain caffeine; or a combination of these. The condition significantly impairs patients’ quality of life and productivity, usually increasing the number of headache days per month and driving higher healthcare-related costs.

 

Treating MOH is challenging because of the lack of high-quality drug trials designed specifically for MOH and because of clinician inexperience. Current evidence is based largely on subgroup analyses of drug trials for the treatment of chronic migraine that included these patients.

Withdrawal of acute care headache medications that are being overused has traditionally been considered an important aspect of MOH treatment, although this may be changing. Withdrawal symptoms, such as increased intensity of headache pain, frequency of headaches, and other symptoms like agitation and sleep disturbance, can prevent patients from discontinuing overused medications. Systemic corticosteroids are widely used to reduce these withdrawal headaches, but clinical trials are sparse and have failed to meet proper endpoints. Despite this, corticosteroids have shown potential benefits, such as decreasing withdrawal headaches, reducing the use of rescue medications, and lowering headache intensity at certain time points after treatment.

Given these findings, the study hypothesized that prednisolone may play a role in converting MOH to non-MOH at 3 months after treatment. The objective was to evaluate the outcome of prednisolone therapy in reversing medication overuse at 3 months posttreatment in patients with MOH, using prospective multicenter registry data. Prednisolone was prescribed to 59 of the 309 patients (19.1%) enrolled during the observational study period, at doses ranging from 10 to 40 mg/day for 5-14 days. Of the 309 enrolled patients, 228 (73.8%) completed the 3-month follow-up period.

Key findings:

  • The MOH reversal rates at 3 months postbaseline were 76% (31/41) in the prednisolone group and 57.8% (108/187) in the no prednisolone group (p = 0.034).
  • The steroid effect remained significant (adjusted odds ratio, 2.78; 95% confidence interval, 1.27-6.1; p = 0.010) after adjusting for the number of monthly headache days at baseline, the mode of discontinuation of overused medication, the use of early preventive medications, and the number of combined preventive medications.
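The 2.78 odds ratio above comes from the authors' covariate-adjusted model. As an illustration of where such a figure comes from, the crude (unadjusted) odds ratio can be recomputed from the raw counts in the first bullet; a minimal Python sketch:

```python
# Crude check of the reported reversal rates, using counts from the study:
# MOH reversal at 3 months in the prednisolone vs no-prednisolone groups.
pred_reversed, pred_n = 31, 41
ctrl_reversed, ctrl_n = 108, 187

pred_rate = pred_reversed / pred_n   # ~0.756, reported as 76%
ctrl_rate = ctrl_reversed / ctrl_n   # ~0.578, reported as 57.8%

# Crude odds ratio = (events/non-events in treated) / (events/non-events in controls).
# The paper's 2.78 is adjusted for covariates, so the crude value differs.
odds_ratio = (pred_reversed * (ctrl_n - ctrl_reversed)) / \
             ((pred_n - pred_reversed) * ctrl_reversed)

print(f"rates: {pred_rate:.1%} vs {ctrl_rate:.1%}, crude OR = {odds_ratio:.2f}")
```

The crude odds ratio works out to about 2.27; the gap between it and the adjusted 2.78 reflects the covariate adjustment in the authors' model, not a discrepancy in the counts.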

The study had several strengths, including multicenter data collection, prospective follow-up, and comprehensive data acquisition. However, it also had significant limitations, such as its noninterventional, observational design; potential bias in steroid prescription (prescribing was left to each physician's discretion); and heterogeneity in the patient population. In addition, the treatments varied and were not standardized. Further external validation may be necessary before the results can be generalized.

Despite these limitations, the results suggest that prednisolone may be one part of a valid treatment option for patients with MOH. I suspect that, if the proper studies are done, we will find that with a good preventive medication, few adverse events, and careful patient education, formal detoxification will not be necessary for many patients with MOH. This has been my experience treating MOH with the newer anti-CGRP preventive medications, including the older monoclonal antibodies and the newer gepants.

 

 
