Clozapine Underutilized in Black Patients With Schizophrenia

TOPLINE:

Black patients with schizophrenia are less likely to receive a clozapine prescription compared with White patients, a new study shows. The findings held even after the researchers controlled for demographic variables, social determinants of health, and care access patterns.

METHODOLOGY:

  • The study drew on structured electronic health record data on 3160 adult patients with schizophrenia.
  • The mean age at first recorded diagnosis was 39.5 years; 70% of participants were male, 53% Black, and 91% resided in an urban setting.
  • The researchers used the social vulnerability index (SVI) to quantify social determinants of health.
  • Descriptive data analysis, logistic regression, and sensitivity analysis were used to identify differences between those who received a clozapine prescription and those who were prescribed antipsychotic medications other than clozapine.

TAKEAWAY:

  • Overall, 401 patients received a clozapine prescription, 51% of whom were White and 40% were Black.
  • Moreover, 19% of all White patients in the study received clozapine vs 10% of Black patients.
  • After the researchers controlled for demographic variables, SVI scores, and care patterns, White patients were significantly more likely to receive a clozapine prescription than Black patients (adjusted odds ratio [aOR], 1.71; P < .001).
  • Factors that had a statistically significant influence on the likelihood of receiving a clozapine prescription were minority status and language (OR, 2.97; P < .007), treatment duration (OR, 1.36; P < .001), and socioeconomic status (OR, 0.27; P = .001).

IN PRACTICE:

“The reasons for the underprescription of clozapine among Black patients with schizophrenia are multifactorial and may include concerns about benign ethnic neutropenia, prescriber bias, prescribers’ anticipation of patients’ nonadherence to the treatment, and the notion that the medication is less effective for Black patients,” the authors wrote.

SOURCE:

Xiaoming Zeng, MD, PhD, professor of psychiatry at the University of North Carolina at Chapel Hill, was the senior and corresponding author of the study. It was published online on March 19 in Psychiatric Services.

LIMITATIONS:

Due to the study’s cross-sectional, single-site design, the findings may not be generalizable to other geographic areas or institutions. The study also lacked information on substance use disorders, common health conditions, and other patient-level data. In addition, it remains unclear whether all patients who received clozapine actually had treatment-resistant schizophrenia, because other research has shown that schizophrenia is overdiagnosed among Black patients.

DISCLOSURES:

The study was supported by a grant from the Foundation of Hope. Dr. Zeng reported no relevant financial relationships. The other authors’ disclosures are listed on the original paper.

A version of this article appeared on Medscape.com.

Infant Microbiome Development Minimally Affected by Diet, but Metabolite Profiles Differ

TOPLINE:

Diet has only a marginal impact on microbiome development in infancy, although metabolite profiles differ between breastfed and formula-fed infants; a circadian rhythm of the gut microbiome is detectable as early as 2 weeks after birth.

METHODOLOGY:

  • A randomized, controlled interventional trial compared microbiota development in 210 newborns who were exclusively breastfed or received one of four formulas: unsupplemented formula, Bifidobacterium-supplemented formula, galacto-oligosaccharide (GOS)-supplemented formula, or formula containing both GOSs and bifidobacteria. Exclusively breastfed infants served as the reference group for evaluating the impact of infant formula feeding.
  • Researchers tracked the infants’ microbiota and metabolite profiles in response to the different feeding modes via stool samples collected periodically during the first 1-2 years of life.
  • They also made note of the time of day that the stool sample was collected to assess 24-hour oscillations of the microbiome in relation to dietary exposure.

TAKEAWAY:

  • Global microbiota assembly of infants is primarily affected by age and less so by diet. All infants showed a gradual increase in gut microbe diversity, and at 24 months, there was no observable difference between the groups.
  • However, gut metabolite profiles differed significantly between exclusively formula-fed and exclusively breastfed infants. None of the supplemented formulas were able to fully recreate the breast milk-related microbial environment.
  • GOS-supplemented formula was more effective at promoting sustained levels of bifidobacteria than formula containing bifidobacteria.
  • Metabolic and bacterial profiling revealed 24-hour fluctuations and circadian networks as early as 2 weeks after birth. Infant microbes maintained circadian rhythms when grown in continuous culture, even in the absence of external light or host cues, suggesting an intrinsic clock mechanism in bacteria.

IN PRACTICE:

“Our findings warrant the need for further analysis of circadian fluctuations of both bacteria and metabolites and their functional role in contributing to the benefits of infant nutrition,” the study authors wrote.

SOURCE:

The study was published online April 2 in Cell Host & Microbe.

LIMITATIONS:

The group size for exclusively formula-fed infants was limited, and the explicit contribution of breast milk, relative to infant formula, to bacterial rhythms remains unclear. A possible limitation of the circadian analysis is that the number of fecal samples collected during the night was lower than during the daytime and decreased with age.

DISCLOSURES:

This research was supported by Töpfer GmbH, the German Research Foundation, the Joint Programming Initiative of the European Union, and the German Ministry of Education and Research. The authors disclosed no relevant conflicts of interest.

A version of this article appeared on Medscape.com.

Do Real-World Data Support Omitting Sentinel Lymph Node Biopsy in Early Stage Breast Cancer?

Ultrasound assessment of lymph node involvement may be substituted for sentinel lymph node biopsy with no change in outcomes in patients with early breast cancer, a new study finds.

This was the conclusion of research scheduled for presentation at the American Society of Breast Surgeons annual meeting.

Sentinel lymph node biopsy (SLNB) is the standard of care for individuals with early-stage HR+HER2- breast cancer to assess nodal involvement, but SLNB can bring complications including postoperative arm problems and lasting lymphedema, according to Andreas Giannakou, MD, of Brigham and Women’s Hospital and the Dana-Farber Cancer Institute, Boston, the presenter of this new research.

The SOUND (Sentinel Node vs. Observation After Axillary Ultra-Sound) trial, published in JAMA Oncology in 2023, showed that ultrasound nodal imaging was a safe and effective alternative to SLNB in certain patients with early-stage breast cancers, but real-world validation was needed, Dr. Giannakou said during a press briefing in advance of the meeting.

Why Was the SOUND Trial Important?

The SOUND trial randomized 1,463 individuals with early-stage (cT1N0) breast cancer (tumors less than 2 cm) and negative findings on axillary ultrasound to either SLNB or no axillary surgical staging.

The 5-year rate of distant disease-free survival was 97.7% in the SLNB group vs. 98% in the no axillary surgery group, suggesting that omission of staging was noninferior to SLNB in these patients and a safe and effective option.

In current practice, nodal status remains a key factor in decision-making for adjuvant systemic therapy in premenopausal patients and in patients with HER2+ and triple-negative breast cancer, Dr. Giannakou said during the press briefing.

“The SOUND trial is a potentially practice-changing study that can spare a specific patient population from axillary surgical staging,” Dr. Giannakou said in an interview. “Before broadly applying clinical trial results to practice, it is important to ensure that the trial population is representative of the population being treated in real world practice,” he said.

What Did the New Study Show? 

In the new study, the researchers identified 312 patients meeting the SOUND trial eligibility criteria in a large database from a single center, and compared disease characteristics and outcomes with the 708 patients in the SLNB arm of the SOUND trial.

The researchers found a similarly high rate of negative SLNB results and very low recurrence in the study population. Notably, only 11.3% of the patients in the current study and 13.1% of patients in the SOUND trial had 1-3 positive lymph nodes, and less than 1% of patients in both cohorts had 4 or more positive nodes, Dr. Giannakou said.

“The population of the current study was similar to the SOUND trial population with respect to treatment characteristics and nodal disease burden,” Dr. Giannakou said during the interview. These findings suggest that omission of sentinel lymph node biopsy in the new study cohort would likely also have been oncologically safe.

“These results are confirmatory but not surprising,” he said. Previous studies have shown that the sensitivity and accuracy of axillary ultrasound are comparable to those of sentinel lymph node biopsy in patients with early breast cancer and only one abnormal lymph node on the ultrasound.

What Are the Clinical Implications?

The current study findings make an important contribution to the effort to de-escalate axillary surgery in early breast cancer, Dr. Giannakou said during the interview. Although SLNB is less morbid than axillary lymph node dissection, the lymphedema risk still exists, and identifying which patients actually benefit from SLNB is critical, he said.

“In our multidisciplinary team, we are working to define selection criteria for postmenopausal patients with HR+HER2- breast cancer who would have met eligibility criteria for the SOUND trial and for whom omission of SLNB would not change adjuvant treatment considerations,” he said.

“Breast surgeons have been moving towards less aggressive axillary surgery based on evidence showing its safety in specific patient cohorts, particularly those with low-risk factors such as older age (70 years and above) and early-stage hormone receptor-positive breast cancer,” Sarah Blair, MD, professor and vice chair in the department of surgery at UC San Diego Health, said in an interview.

“The Choosing Wisely recommendations, issued by the Society of Surgical Oncology, advise against routine use of sentinel lymph node biopsy in women aged 70 and older with early-stage hormone receptor–positive breast cancer; these recommendations are based on clinical trials demonstrating oncologic safety in this population,” said Dr. Blair, who was not involved in the SOUND trial or the current study.

The data from the new study are encouraging and highlight the generalizability of the SOUND results, Mediget Teshome, MD, chief of breast surgery at UCLA Health, said in an interview. The results help to define a low-risk group of patients for whom sentinel node staging may be omitted, after multidisciplinary discussion to ensure that nodal staging will not impact adjuvant systemic therapy or radiation decision-making, said Dr. Teshome, who was not involved in the SOUND trial or the current study.

What Are the Limitations of the SOUND Trial and the New Study?

The limitations of the current study included its design, a retrospective review of a prospective database subject to selection bias; the lack of standard criteria for preoperative axillary ultrasound; and the omission of SLNB in many patients older than 70 years in accordance with the Choosing Wisely criteria, Dr. Giannakou said in the press briefing.

“Despite the evidence supporting axillary surgery de-escalation, it can be challenging for surgeons to change their practice based on a single study,” Dr. Blair said in an interview. However, the SOUND trial findings support current evidence, giving surgeons more confidence to discuss multidisciplinary treatment options, she said.

What Additional Research Is Needed?

“Longer follow-up is needed to make definitive conclusions about the oncologic outcomes of axillary surgery de-escalation in this patient population,” said Dr. Blair. “Given that slow-growing tumors are involved, the time to recurrence may extend beyond the typical follow-up period of three years.

“Ongoing research and collaboration among multidisciplinary teams are essential to ensure optimal treatment decisions and patient outcomes,” she emphasized.

Dr. Giannakou, Dr. Blair, and Dr. Teshome had no financial conflicts to disclose.

Safety Risks Persist with Out-of-Hospital Births

Safety concerns persist for out-of-hospital births in the United States with multiple potential risk factors and few safety requirements, according to a paper published in the American Journal of Obstetrics and Gynecology.

In 2022, the Centers for Disease Control and Prevention (CDC) reported the highest number of planned home births in 30 years. Home births rose 12% from 2020 to 2021, the latest period for which complete data are available, increasing from 45,646 (1.26% of births) to 51,642 (1.41% of births).

Amos Grünebaum, MD, and Frank A. Chervenak, MD, of Northwell Health and the Department of Obstetrics and Gynecology, Lenox Hill Hospital, Zucker School of Medicine, New Hyde Park, New York, reviewed the latest safety data surrounding community births in the United States, along with well-known perinatal risks and the requirements for safe out-of-hospital births.

“Most planned home births continue to have one or more risk factors that are associated with an increase in adverse pregnancy outcomes,” they wrote.
 

Birth Certificate Data Analyzed

The researchers used the CDC birth certificate database to analyze deliveries between 2016 and 2022 for the incidence of perinatal risks in community births. The risks included prior cesarean delivery, first birth, maternal age older than 35 years, twins, breech presentation, gestational age of less than 37 weeks or more than 41 weeks, newborn weight over 4,000 g, adequacy of prenatal care, grand multiparity (five or more prior pregnancies), and a prepregnancy body mass index of at least 35.

The incidence of perinatal risks for out-of-hospital births ranged individually from 0.2% to 28.54% among birthing center births and 0.32% to 24.4% for planned home births.

“The ACOG committee opinion on home births states that for every 1000 home births, 3.9 babies will die,” the authors noted, which is about twice the risk of hospital births. The deaths are “potentially avoidable with easy access to an operating room,” they wrote.

Among the safety concerns for perinatal morbidity and mortality in community births, the authors cited the lack of:

  • Appropriate patient selection for out-of-hospital births through standardized guidelines.
  • Availability of a Certified Nurse Midwife, a Certified Midwife, or a midwife whose education and licensure meet the International Confederation of Midwives’ (ICM) Global Standards for Midwifery Education.
  • Providers practicing obstetrics within an integrated and regulated health system, with board-certified obstetricians readily available to provide consultation for qualified midwives.
  • Standardized guidelines on when transport to a hospital is necessary.

“While prerequisites for a safe out-of-hospital delivery may be in place in other high-income countries, these prerequisites have not been actualized in the United States,” the authors wrote.

Incorporating Patient Preferences Into Delivery Models

Yalda Afshar, MD, PhD, a maternal-fetal medicine subspecialist and physician-scientist at UCLA Health in California, said obstetricians are responsible for offering the most evidence-based care to pregnant people.

“What this birth certificate data demonstrates,” she said, “is a tendency among birthing people to opt for out-of-hospital births, despite documented risks to both the pregnant person and the neonate. This underscores the need to persist in educating on risk stratification, risk reduction, and safe birthing practices, while also fostering innovation. Innovation should stem from our commitment to incorporate the preferences of pregnant people into our healthcare delivery model.”

Dr. Afshar, who was not part of the study, said clinicians should develop innovative ways to effectively meet the needs of pregnant patients while ensuring their safety and well-being.

“Ideally, we would establish safe environments within hospital systems and centers that emulate home-like birthing experiences, thereby mitigating risks for these families,” she said.

Though not explicitly stated in the data, she added, it is crucial to emphasize the need for continuous risk assessment throughout pregnancy and childbirth, “with a paramount focus on the safety of the pregnant individual.”

The authors and Dr. Afshar have no relevant financial disclosures.

Oncologists Voice Ethical Concerns Over AI in Cancer Care

TOPLINE:

A recent survey highlighted ethical concerns US oncologists have about using artificial intelligence (AI) to help make cancer treatment decisions and revealed some contradictory views about how best to integrate these tools into practice. Most respondents, for instance, said patients should not be expected to understand how AI tools work, but many also felt patients could make treatment decisions based on AI-generated recommendations. Most oncologists also felt responsible for protecting patients from biased AI, but few were confident that they could do so.

METHODOLOGY:

  • The US Food and Drug Administration (FDA) has approved AI-based tools for use in various medical specialties over the past few decades, and increasingly, AI tools are being integrated into cancer care.
  • However, the uptake of these tools in oncology has raised ethical questions and concerns, including challenges with AI bias, error, or misuse, as well as issues explaining how an AI model reached a result.
  • In the current study, researchers asked 204 oncologists from 37 states for their views on the ethical implications of using AI for cancer care.
  • Among the survey respondents, 64% were men and 63% were non-Hispanic White; 29% were from academic practices, 47% had received some education on AI use in healthcare, and 45% were familiar with clinical decision models.
  • The researchers assessed respondents’ answers to various questions, including whether patients should provide informed consent for AI use and how oncologists would approach a scenario in which the AI model and the oncologist recommended different treatment regimens.

TAKEAWAY:

  • Overall, 81% of oncologists supported obtaining patient consent for the use of an AI model in treatment decisions, and 85% felt that oncologists needed to be able to explain an AI-based clinical decision model to use it in the clinic; however, only 23% felt that patients also needed to be able to explain an AI model.
  • When an AI decision model recommended a different treatment regimen than the treating oncologist, the most common response (36.8%) was to present both options to the patient and let the patient decide. Oncologists from academic settings were about 2.5 times more likely than those from other settings to let the patient decide. About 34% of respondents said they would present both options but recommend the oncologist’s regimen, whereas about 22% said they would present both but recommend the AI’s regimen. A small percentage would only present the oncologist’s regimen (5%) or the AI’s regimen (about 2.5%).
  • About three of four respondents (76.5%) agreed that oncologists should protect patients from biased AI tools; however, only about one of four (27.9%) felt confident they could identify biased AI models.
  • Most oncologists (91%) felt that AI developers were responsible for the medico-legal problems associated with AI use; fewer said oncologists (47%) or hospitals (43%) shared this responsibility.

IN PRACTICE:

“Together, these data characterize barriers that may impede the ethical adoption of AI into cancer care. The findings suggest that the implementation of AI in oncology must include rigorous assessments of its effect on care decisions, as well as decisional responsibility when problems related to AI use arise,” the authors concluded.

SOURCE:

The study, with first author Andrew Hantel, MD, from Dana-Farber Cancer Institute, Boston, was published last month in JAMA Network Open.

LIMITATIONS:

The study had a moderate sample size and response rate, although the demographics of participating oncologists appear to be nationally representative. The cross-sectional design also limits the generalizability of the findings over time as AI is integrated into cancer care.

DISCLOSURES:

The study was funded by the National Cancer Institute, the Dana-Farber McGraw/Patterson Research Fund, and the Mark Foundation Emerging Leader Award. Dr. Hantel reported receiving personal fees from AbbVie, AstraZeneca, the American Journal of Managed Care, Genentech, and GSK.

A version of this article appeared on Medscape.com.

Further Support for CRC Screening to Start at Age 45: Meta-Analysis


TOPLINE:

For individuals aged 45-49 years at average risk for colorectal cancer (CRC), the adenoma detection rate (ADR) in screening colonoscopies is 28%, which is comparable with rates seen in those aged 50-54 years.

METHODOLOGY:

  • The rising incidence of CRC in younger populations has prompted most guidelines to recommend starting screening at age 45. However, the impact of lowering the screening age on adenoma and sessile serrated lesion detection rates remains unclear.
  • Researchers conducted a systematic review and meta-analysis of 16 studies; all studies were retrospective except one.
  • Patients aged 45-49 years undergoing colonoscopy for any indication were included, with a separate analysis of patients in that age group at average CRC risk undergoing screening colonoscopies.
  • The primary outcomes were the overall detection rates of adenomas and sessile serrated lesions in colonoscopies performed for any indication.

TAKEAWAY:

  • Across 15 studies, 41,709 adenomas were detected in 150,436 colonoscopies performed for any indication, resulting in a pooled overall ADR of 23.1% (see the pooling sketch after this list).
  • Across six studies, 1162 sessile serrated lesions were reported in 11,457 colonoscopies performed for any indication, with a pooled detection rate of 6.3%.
  • Across seven studies, the pooled ADR in screening colonoscopies performed on individuals with average CRC risk was 28.2%, which is comparable with that of 50- to 54-year-old individuals undergoing screening colonoscopy. There were not enough data to calculate the sessile serrated lesion detection rate in average-risk patients.
  • The ADR was higher in studies from the United States and Canada (26.1%) than in studies from Asia (16.9%).
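
The pooled detection rates above come from meta-analytic weighting of per-study proportions rather than simple division of the totals. Below is a minimal sketch of one standard approach, inverse-variance pooling on the logit scale, using hypothetical per-study counts; the authors' actual model (likely random-effects) may differ.

```python
import math

def pooled_proportion(events, totals):
    """Fixed-effect inverse-variance pooling of proportions on the logit scale.

    events[i] = colonoscopies with >=1 adenoma in study i; totals[i] = all
    colonoscopies in study i. A simplified sketch of meta-analytic pooling."""
    num = den = 0.0
    for e, n in zip(events, totals):
        logit = math.log(e / (n - e))              # log-odds of detection in study i
        weight = 1.0 / (1.0 / e + 1.0 / (n - e))   # inverse of the logit's variance
        num += weight * logit
        den += weight
    return 1.0 / (1.0 + math.exp(-num / den))      # back-transform to a proportion

# Hypothetical per-study counts, for illustration only:
print(pooled_proportion([230, 1150, 85], [1000, 5200, 400]))  # ~0.22
```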

IN PRACTICE:

“The comparable detection rates of precancerous lesions in this age group to those 50 to 54 years old support starting CRC screening at 45 years of age,” the authors wrote.

SOURCE:

This study, led by Mohamed Abdallah, MD, Division of Gastroenterology and Hepatology, University of Minnesota Medical Center, Minneapolis, was published online in The American Journal of Gastroenterology.

LIMITATIONS:

The inclusion of retrospective studies has an inherent bias. The heterogeneity between studies may limit the generalizability of the findings. Some studies that reported detection rates included individuals at both average and high risk for CRC, so they could not be used to evaluate ADRs in individuals with an average risk for CRC. Data duplication could not be ruled out.

DISCLOSURES:

The study did not receive any funding. The authors declared no conflicts of interest.

A version of this article appeared on Medscape.com.


GLP-1 Receptor Agonists Don’t Raise Thyroid Cancer Risk


TOPLINE:

No significant association was found between the use of glucagon-like peptide 1 receptor agonists (GLP-1 RAs) and thyroid cancer over nearly 4 years.

METHODOLOGY:

  • A cohort study using data from nationwide registers in Denmark, Norway, and Sweden between 2007 and 2021 included 145,410 patients who initiated GLP-1 RAs and 291,667 propensity score-matched patients initiating dipeptidyl peptidase 4 (DPP4) inhibitors as active comparators (a matching sketch follows this list).
  • An additional analysis included 111,744 patients who initiated GLP-1 RAs and 148,179 patients who initiated sodium-glucose cotransporter 2 (SGLT2) inhibitors.
  • Overall, mean follow-up time was 3.9 years, with 25% followed for more than 6 years.
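
Propensity score matching, mentioned in the first bullet, pairs each GLP-1 RA initiator with comparator patients who had a similar estimated probability of receiving the treatment given their baseline covariates. The sketch below shows one common variant, greedy 1:2 nearest-neighbor matching on a logistic-regression propensity score; the covariates, caliper, and matching ratio here are assumptions for illustration, not the study's actual specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def match_1_to_2(ps_treated, ps_control, caliper=0.05):
    """Greedy 1:2 nearest-neighbor matching on the propensity score,
    without replacement. A simplified sketch, not the study's algorithm."""
    used, matches = set(), []
    for i, p in enumerate(ps_treated):
        order = np.argsort(np.abs(ps_control - p))  # closest controls first
        picks = [int(j) for j in order
                 if j not in used and abs(ps_control[j] - p) <= caliper][:2]
        used.update(picks)
        matches.append((i, picks))
    return matches

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 3))            # hypothetical baseline covariates
treated = rng.random(300) < 0.3          # hypothetical GLP-1 RA initiation
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]
matches = match_1_to_2(ps[treated], ps[~treated])
```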

TAKEAWAY:

  • The most common individual GLP-1 RAs were liraglutide (57.3%) and semaglutide (32.9%).
  • During follow-up, there were 76 incident thyroid cancer cases among GLP-1 RA users and 184 cases among DPP4 inhibitor users, giving incidence rates of 1.33 and 1.46 per 10,000 person-years, respectively, a nonsignificant difference (hazard ratio [HR], 0.93; 95% CI, 0.66-1.31; see the quick check after this list).
  • Papillary thyroid cancer was the most common thyroid cancer subtype, followed by follicular and medullary, with no significant increases in risk with GLP-1 RAs by cancer type, although the numbers were small.
  • In the SGLT2 inhibitor comparison, there was also no significantly increased thyroid cancer risk for GLP-1 RAs (HR, 1.16; 95% CI, 0.65-2.05).
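
As a quick check on the figures in the second bullet: a rate per 10,000 person-years is simply cases divided by total follow-up time. The person-time values below are back-calculated from the reported rates and are illustrative, not numbers taken from the paper.

```python
def rate_per_10k(cases: int, person_years: float) -> float:
    """Incidence rate expressed per 10,000 person-years of follow-up."""
    return cases / person_years * 10_000

# Person-time back-calculated from the reported rates (illustrative only):
print(rate_per_10k(76, 571_000))      # GLP-1 RA users: ~1.33
print(rate_per_10k(184, 1_260_000))   # DPP4 inhibitor users: ~1.46
```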

IN PRACTICE:

“Given the upper limit of the confidence interval, the findings are incompatible with more than a 31% increased relative risk of thyroid cancer. In absolute terms, this translates to no more than 0.36 excess cases per 10 000 person-years, a figure that should be interpreted against the background incidence of 1.46 per 10,000 person-years among the comparator group in the study populations,” the authors wrote.

SOURCE:

This study was conducted by Björn Pasternak, MD, PhD, of the Karolinska Institutet, Stockholm, and colleagues. It was published online on April 10, 2024, in The BMJ.

LIMITATIONS:

Follow-up was relatively short for assessing cancer risk, risk was not analyzed by individual GLP-1 RA, and event numbers were small. The observational design leaves potential for residual confounding and time-related biases.

DISCLOSURES:

The study was supported by grants from the Swedish Cancer Society and the Swedish Research Council. Dr. Pasternak was supported by a consolidator investigator grant from Karolinska Institutet. Some of the coauthors had industry disclosures.

A version of this article appeared on Medscape.com.


No Major Differences in Improvement Seen with Progressive Resistance Training Versus Neuromuscular Exercise for Hip Osteoarthritis


Progressive resistance training (PRT) and neuromuscular exercise (NEMEX) programs result in similar improvements in hip function, pain, and hip-related quality of life (QOL) in people with hip osteoarthritis (OA), according to the results of a randomized controlled trial.

At the end of the 12-week exercise period, both interventions yielded changes from baseline on the 30-second chair stand test (30s-CST) that were below the threshold for a major clinical effect. 

Mean changes in the Hip Disability and Osteoarthritis Outcome Score (HOOS) pain subscale and HOOS QOL score were also similar among the participants, regardless of which exercise program they had been assigned to.

“The lack of superiority of PRT for increasing muscle strength and power is surprising given the principle of specificity (higher-intensity resistance training yields greater improvements in maximal muscle strength),” according to the Danish researchers who reported the results online today in Annals of Internal Medicine.

“However, the point estimates only showed modest and uncertain superiority of PRT for increasing muscle strength and power and no differences for any functional performance tests or self-reported physical function,” they added.

The Power of Exercise

Worldwide, most clinical guidelines recommend exercise as a first-line conservative treatment option in both hip and knee OA. However, there is not much evidence to help guide healthcare practitioners in deciding which type of exercises to use with their patients, Troels Kjeldsen, MSc, the principal investigator for the study, told this news organization.

“Neuromuscular exercise is a very commonly used exercise program in clinical practice, but, to our knowledge, it has never been compared to another type of exercise in hip OA,” observed Mr. Kjeldsen, who is a PhD student in the department of orthopedic surgery at Aarhus University Hospital, Aarhus, Denmark.

“Each year, many thousands of patients are referred to having neuromuscular exercise therapy with a physiotherapist,” Mr. Kjeldsen said. “So, we thought it would be worthwhile to compare it to PRT, another promising exercise type, to see if it really did perform as well as I think most people thought it did,” he added.

Comparing the Two Exercise Programs

PRT and NEMEX are two different types of exercise programs. PRT involves using resistance-training machines, and the focus is to maximize the exercise intensity by using as high an exercise load or weight as possible. By contrast, NEMEX consists of exercises that are low to moderate in intensity and emphasizes alignment, control, and stability of the movements.

To compare the two exercise strategies, Mr. Kjeldsen and fellow investigators recruited 160 participants at five hospitals and 10 physiotherapy clinics across three of five healthcare regions in Denmark.

For inclusion in the trial, participants had to have a clinical diagnosis of hip OA, be older than 45 years, and report pain during activity in one or both hips rated 3 or higher on a 10-point numerical rating scale. Participants also had to have morning hip joint stiffness lasting less than 30 minutes, if any, and no surgery involving the lower extremities in the previous 6 months.

Participants were then randomized to undertake the PRT (n = 82) or NEMEX (n = 78) program, delivered as two physiotherapist-led group sessions every week for 12 weeks. Exercise sessions were held at least 72 hours apart and consisted of a 10-minute warm-up on an exercise bike and then 50 minutes of PRT or NEMEX. PRT consisted of five generic resistance-based exercises targeting hip and knee joint muscles and NEMEX consisted of 10 exercises that increased in difficulty by varying the number, direction, speed, and surface of the movements performed.

Dead Heat Between PRT and NEMEX

The primary endpoint was the 30s-CST, which counts the number of times participants can stand from a seated position in 30 seconds. Participants in the PRT and NEMEX groups performed this maneuver 11.3 and 11.6 times, respectively, at baseline and 12.8 and 13.1 times after completing the exercise programs.

Other functional performance tests included a 40-m fast-paced walk, a nine-step timed stair climb, leg extensor power in the affected and unaffected limb, and a unilateral single repetition leg press. None of these showed a statistically significant benefit of PRT over NEMEX, or vice versa.

HOOS pain scores at baseline and 12 weeks were 57.5 and 66.1 for PRT, an 8.6-point increase, and 58.9 and 68.2 for NEMEX, a 9.3-point increase, leaving a between-group difference in change of only −0.7 points.

Corresponding baseline and 12-week HOOS QOL scores were 43.7 and 51.7 for PRT and 47.1 and 52.8 for NEMEX, giving 8.0- and 5.7-point increases, respectively, and a 2.3-point between-group difference in change. Again, this fell short of a clinically meaningful effect.
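
Worked out from the reported means, the change scores and between-group differences are:

```latex
\begin{aligned}
\text{Pain: } & \Delta_{\mathrm{PRT}} = 66.1 - 57.5 = 8.6, \quad
                \Delta_{\mathrm{NEMEX}} = 68.2 - 58.9 = 9.3, \quad
                \Delta_{\mathrm{PRT}} - \Delta_{\mathrm{NEMEX}} = -0.7 \\
\text{QOL: }  & \Delta_{\mathrm{PRT}} = 51.7 - 43.7 = 8.0, \quad
                \Delta_{\mathrm{NEMEX}} = 52.8 - 47.1 = 5.7, \quad
                \Delta_{\mathrm{PRT}} - \Delta_{\mathrm{NEMEX}} = 2.3
\end{aligned}
```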

Future Steps

“The effect of exercise seems to be at its highest at 3-4 months when you implement exercise, so we compared the effects of the exercises at the time when they are probably going to be at their highest,” Mr. Kjeldsen explained. He said the research team also plans to look at what happens after 1 year of follow-up.

“The key take home message is that patients can be encouraged to pick the type of exercise that they find the most enjoyable, or the type that is available to them,” Mr. Kjeldsen suggested. 

Stephanie Chang, MD, MPH, who is the Deputy Editor of Annals of Internal Medicine and practices in Rockville, Maryland, commented on the paper to this news organization. “In this small study, we learned that exercises to strengthen lower extremity muscles did not improve pain or function any more than exercises for core stability and balance,” she said.

Dr. Chang pointed out that there was variation in the levels of activity that people already undertook at baseline: 40% of the PRT group and 41% of the NEMEX group already did 150 minutes or more of moderate-intensity physical activity.

“It’s possible that benefit or differences between interventions would be greater in people with different levels of baseline activity or even in those with different osteoarthritis severity,” she said. 

“In the meantime,” Dr. Chang added, “with the findings from this study, I would feel comfortable advising my patients with hip osteoarthritis to engage in whichever type of exercise they prefer — whether that exercise focuses on core strengthening and balance or on specific lower extremity muscle strengthening.”

The trial was funded by the Independent Research Fund Denmark, the Physiotherapy Practice Foundation, the Health Foundation, Aarhus University, Region Zealand, the Association of Danish Physiotherapists, Andelsfonden, and Hede Nielsens Family Foundation. Mr. Kjeldsen and Dr. Chang report no relevant financial relationships.

A version of this article appeared on Medscape.com.


Parotid and Labial Gland Biopsies Provide Similar Help in Diagnosing Sjögren Syndrome


TOPLINE:

Both labial and parotid salivary glands can be used for the diagnosis of Sjögren syndrome (SjS), as their biopsies show largely similar histopathologic features in patients with sicca complaints suspected of having SjS.

METHODOLOGY:

  • While a labial gland biopsy is the conventional method for diagnosing SjS, a biopsy of the parotid gland is preferable, as it allows for repeat measurements and increases the possibility of finding a mucosa-associated lymphoid tissue lymphoma.
  • In this prospective study, researchers compared the focus score (FS) and other histopathologic features of SjS between paired labial and parotid salivary gland biopsies in a diagnostic cohort of patients with sicca suspected of having SjS.
  • Labial and parotid gland biopsies were simultaneously obtained under local infiltration anesthesia in 99 patients with oral and/or ocular sicca complaints at the University Medical Center Groningen, Groningen, the Netherlands, between 2014 and 2017.
  • FS is defined as the number of lymphocytic foci per 4 mm² of salivary gland tissue; an FS ≥ 1 indicates a positive diagnosis of SjS (see the sketch after this list).
  • On the basis of an expert opinion of three experienced rheumatologists, 36 patients were diagnosed with SjS, and 63 were diagnosed with non-SjS sicca.
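
The focus score in the fourth bullet normalizes the focus count to a 4-mm² reference area. A minimal sketch of that calculation (the function and example values are illustrative, not taken from the paper):

```python
def focus_score(n_foci: int, tissue_area_mm2: float) -> float:
    """Focus score: lymphocytic foci per 4 mm^2 of salivary gland tissue.

    A focus is conventionally an aggregate of >=50 lymphocytes; FS >= 1
    supports a diagnosis of Sjogren syndrome."""
    return n_foci / tissue_area_mm2 * 4.0

print(focus_score(3, 10.0))  # 1.2 -> meets the FS >= 1 threshold
```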

TAKEAWAY:

  • The absolute agreement of various histopathologic features between labial and parotid biopsies was high: 80% for FS, 89% for germinal centers, 84% for the immunoglobulin (Ig) A/IgG plasma cell shift, and 93% for pre-lymphoepithelial lesions (a simple way to compute such agreement is sketched after this list).
  • However, an FS ≥ 1 was more frequently seen in labial glands than in parotid glands (P = .012) in both the SjS and non-SjS sicca populations, indicating that labial gland biopsies show more inflammation irrespective of the presence of SjS.
  • In patients with SjS, the absolute B-lymphocyte count, the number of germinal centers per mm², and the severity of pre-lymphoepithelial lesions were higher in parotid glands than in labial glands, revealing evident histopathologic signs of B-lymphocyte hyperactivity.
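
Absolute agreement here is, in essence, the proportion of patients whose two biopsies give the same reading for a feature. A minimal sketch under that assumption, with illustrative data; the authors may have computed agreement with refinements not shown:

```python
def absolute_agreement(labial: list[bool], parotid: list[bool]) -> float:
    """Fraction of patients whose labial and parotid biopsies agree on a
    binary feature (e.g., FS >= 1: yes/no)."""
    assert len(labial) == len(parotid)
    return sum(a == b for a, b in zip(labial, parotid)) / len(labial)

# Illustrative data: 4 of 5 paired biopsies agree -> 0.8 (80%)
print(absolute_agreement([True, True, False, False, True],
                         [True, False, False, False, True]))
```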

IN PRACTICE:

“The results of this study offer novel insights into the pathophysiology of pSS [primary Sjögren syndrome] and can be incorporated into guidelines for the histopathological analysis of salivary gland biopsies,” the authors wrote.

SOURCE:

This study, led by Uzma Nakshbandi, MD, Department of Rheumatology & Clinical Immunology, University Medical Center Groningen, Groningen, the Netherlands, was published online in Rheumatology (Oxford).

LIMITATIONS:

This study did not discuss any limitations.

DISCLOSURES:

This study was funded by the National Institutes of Health. One of the authors disclosed serving as a consultant and scientific advisory board member for several pharmaceutical companies, as well as receiving speaker’s fees from some of them.

A version of this article appeared on Medscape.com.


Speedy Eating and Late-Night Meals May Take a Toll on Health


You are what you eat, as the adage goes. But a growing body of evidence indicates that it’s not just what and how much you eat that influence your health. How fast and when you eat also play a role.

Research now indicates that these two factors may affect the risk for gastrointestinal problems, obesity, and type 2 diabetes (T2D). Because meal timing and speed of consumption are modifiable, they present new opportunities to change patient behavior to help prevent and perhaps address these conditions.

Not So Fast

Most people are well acquainted with the short-term gastrointestinal effects of eating too quickly, which include indigestion, gas, bloating, and nausea. But regularly eating too fast can also have long-term consequences.

Achieving a sense of fullness is key to staving off overeating and excess caloric intake. However, it takes approximately 20 minutes for the stomach to alert the brain to feelings of fullness. Eat too quickly, and the fullness signal might not register until you’ve consumed more calories than intended. Research links this habit to excess body weight.

The practice also can lead to gastrointestinal diseases over the long term because overeating causes food to remain in the stomach longer, thus prolonging the time that the gastric mucosa is exposed to gastric acids.

A study of 10,893 adults in Korea reported that those with the fastest eating speed (< 5 min/meal) had a 1.7 times greater likelihood of endoscopic erosive gastritis than those with the slowest times (≥ 15 min/meal). Faster eating also was linked to increased risk for functional dyspepsia in a study involving 89 young-adult female military cadets in Korea with relatively controlled eating patterns.

On the extreme end of the spectrum, researchers who performed an assessment of a competitive speed eater speculated that the observed physiological accommodation required for the role (expanding the stomach to form a large flaccid sac) makes speed eaters vulnerable to morbid obesity, gastroparesis, intractable nausea and vomiting, and the need for gastrectomy.

The risk for metabolic changes and the eventual development of T2D also appears to be linked to how quickly food is consumed.

Two clinical studies conducted in Japan — a cohort study of 2050 male factory workers and a nationwide study with 197,825 participants — identified a significant association between faster eating and T2D and insulin resistance. A case-control study involving 234 patients with new-onset T2D and 468 controls from Lithuania linked faster eating to a greater than twofold risk for T2D. And a Chinese cross-sectional study of 7972 adults indicated that faster eating significantly increased the risk for metabolic syndrome, elevated blood pressure, and central obesity.

Various hypotheses have been proposed to explain why fast eating may upset metabolic processes, including a delayed sense of fullness contributing to spiking postprandial glucose levels, lack of time for mastication causing higher glucose concentrations, and the triggering of specific cytokines (eg, interleukin-1 beta and interleukin-6) that lead to insulin resistance. It is also possible that the association is the result of people who eat quickly having relatively higher body weights, which translates to a higher risk for T2D.

However, the association of rapid meal consumption with gastrointestinal and metabolic diseases also presents an opportunity: people can slow the speed at which they eat so that fullness registers before they overeat.

A 2019 study in which 21 participants were instructed to eat a 600-kcal meal at a “normal” or “slow” pace (6 minutes or 24 minutes) found that the latter group reported feeling fuller while consuming fewer calories.
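
For perspective, that pacing works out to roughly 100 kcal/min at the 6-minute pace versus 25 kcal/min at the 24-minute pace, a fourfold difference in eating rate.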

This approach may not work for all patients, however. There’s evidence to suggest that tactics to slow down eating may not limit the energy intake of those who are already overweight or obese.

Patients with obesity may physiologically differ in their processing of food, according to Michael Camilleri, MD, consultant in the Division of Gastroenterology and Hepatology at Mayo Clinic in Rochester, Minnesota.

“We have demonstrated that about 20%-25% of people with obesity actually have rapid gastric emptying,” he told this news organization. “As a result, they don’t feel full after they eat a meal and that might impact the total volume of food that they eat before they really feel full.”

The Ideal Time to Eat

It’s not only the speed at which individuals eat that may influence outcomes but when they take their meals. Research indicates that eating earlier in the day to align meals with the body’s circadian rhythms in metabolism offers health benefits.

“The focus would be to eat a meal that syncs during those daytime hours,” Collin Popp, PhD, MS, RD, a research scientist at the NYU Grossman School of Medicine in New York, told this news organization. “I typically suggest patients have their largest meal in the morning, whether that’s a large or medium-sized breakfast, or a big lunch.”

A recent cross-sectional study of 2050 participants found that having the largest meal at lunch protected against obesity (odds ratio [OR], 0.71), whereas having it at dinner increased the risk for obesity (OR, 1.67) and was associated with a higher body mass index.

Consuming the majority of calories in meals earlier in the day may have metabolic health benefits, as well.

A 2015 randomized controlled trial involving 18 adults with obesity and T2D found that eating a high-energy breakfast and a low-energy dinner led to reduced hyperglycemia throughout the day compared with eating a low-energy breakfast and a high-energy dinner.

Time-restricted eating (TRE), a form of intermittent fasting, also can improve metabolic health depending on the time of day.

A 2023 meta-analysis found that TRE was more effective at reducing fasting glucose levels in participants who were overweight or obese when done earlier rather than later in the day. Similarly, a 2022 study involving 82 healthy participants without diabetes or obesity found that early TRE was more effective than midday TRE at improving insulin sensitivity; early TRE also improved fasting glucose and reduced total body mass and adiposity, whereas midday TRE did not.

A study that analyzed the effects of TRE in eight adult men with overweight and prediabetes found “better insulin resistance when the window of food consumption was earlier in the day,” noted endocrinologist Beverly Tchang, MD, an assistant professor of clinical medicine at Weill Cornell Medicine who focuses on obesity medicine.

Patients May Benefit From Behavioral Interventions

Patients potentially negatively affected by eating too quickly or at late hours may benefit from adopting behavioral interventions to address these tendencies. To determine if a patient is a candidate for such interventions, Dr. Popp recommends starting with a simple conversation.

“When I first meet patients, I always ask them to describe to me a typical day for how they eat — when they’re eating, what they’re eating, the food quality, who are they with — to see if there’s social aspects to it. Then try and make the recommendations based on that,” said Dr. Popp, whose work focuses on biobehavioral interventions for the treatment and prevention of obesity, T2D, and other cardiometabolic outcomes.

Dr. Tchang said she encourages her patients to be mindful of hunger and fullness cues.

“Eat if you’re hungry; don’t force yourself to eat if you’re not hungry,” she said. “If you’re not sure whether you’re hungry or not, speak to a doctor because this points to an abnormality in your appetite-regulation system, which can be helped with GLP-1 [glucagon-like peptide 1] receptor agonists.”

Adjusting what patients eat can help them improve their meal timing.

“For example, we know that a high-fiber diet or a diet that has a large amount of fat in it tends to empty from the stomach slower,” Dr. Camilleri said. “That might give a sensation of fullness that lasts longer and that might prevent, for instance, the ingestion of the next meal.”

Those trying to eat more slowly are advised to seek out foods that are hard in texture and minimally processed.

A study involving 50 patients with healthy weights found that hard foods are consumed more slowly than soft foods and that energy intake is lowest with hard, minimally processed foods. Combining hard-textured foods with explicit instructions to reduce eating speed has also been shown to be an effective strategy. For those inclined to seek out a technology-based solution, evidence suggests that a self-monitoring wearable device can slow the eating rate.

Although the evidence is mounting that the timing and duration of meals have an impact on certain chronic diseases, clinicians should remember that these two factors are far from the most important contributors, Dr. Popp said.

“We also have to consider total caloric intake, food quality, sleep, alcohol use, smoking, and physical activity,” he said. “Meal timing should be considered as under the umbrella of health that is important for a lot of folks.”

A version of this article appeared on Medscape.com.
