Digital Pathology Seminar Focuses on Federal Practice
Recognizing the increasing importance of digital pathology and its potential to transform federal health care, digital pathology specialists from government, the military, and universities convened in May 2023 to share expertise and advance the use of digital pathology in federal health care.
The seminar was hosted by the University of Pittsburgh and led by Ronald Poropatich, MD, Director of the Center for Military Medicine Research, Health Sciences, and Professor of Medicine at the University of Pittsburgh Medical Center, and Douglas Hartman, MD, Vice Chair of Pathology Informatics, Associate Director of the Center for AI Innovation in Medical Imaging, and Professor of Pathology at the University of Pittsburgh/University of Pittsburgh Medical Center (UPMC).
Invitees included senior federal government pathologists, laboratory scientists, IT leaders, and stakeholders from the VA, DoD, HHS (NIH, CDC, IHS, FDA), and other federal agencies. The speakers for the conference were CDR Roger Boodoo, MD, Chief of Innovation, Defense Health Agency; Ryan Collins, MD, Pathologist, Williamsport Pathology Association; Pat Flanders, Chief Information Officer, J6, Defense Health Agency; Matthew Hanna, MD, Director, Digital Pathology Informatics, Memorial Sloan Kettering Cancer Center; Stephanie Harmon, PhD, Staff Scientist, NIH NCI, Imaging/Data Scientist in Molecular Imaging; Douglas Hartman, MD, Vice Chair of Pathology Informatics, University of Pittsburgh; Stephen Hewitt, MD, PhD, Head, Experimental Pathology Laboratory, NIH NCI, Center for Cancer Research; Jason Hipp, MD, PhD, Chief Digital Innovation Officer, Mayo Collaborative Services, Mayo Clinic; Brian Lein, MD, Assistant Director, Healthcare Administration, Defense Health Agency; Col Mark Lyman, MD, Pathology Consultant to the US Air Force Surgeon General; COL Joel Moncur, MD, Director, Joint Pathology Center; Ronald Poropatich, MD, Director of the Center for Military Medicine Research, Health Sciences, and Professor of Medicine, University of Pittsburgh; David Shulkin, MD, Ninth US Secretary of Veterans Affairs; Eliot Siegel, MD, Chief of Radiology and Nuclear Medicine, Veterans Affairs Maryland Healthcare System, and Professor and Vice Chair, University of Maryland School of Medicine; CDR Jenny Smith, DO, Pathologist, US Naval Medical Center Portsmouth; Shandong Wu, PhD, Associate Professor, Departments of Radiology, Biomedical Informatics, and Bioengineering, and Director of the Center for Artificial Intelligence Innovation in Medical Imaging, University of Pittsburgh; and LCDR Victoria Mahar, MD, Pathologist, US Army.
Throughout the 1.5-day meeting, participants discussed topics such as the integration of systems, the value of single-vendor vs multivendor solutions, and the interconnectedness of radiology and pathology in health care. The speakers addressed the challenges of adopting digital pathology, including workflow improvement, quality control, and the generalizability of algorithms. The importance of collaboration, leadership, data analytics, compliance with clinical practice guidelines, and research and development efforts was stressed. The increasingly important role of artificial intelligence (AI) in digital pathology, its applications, and its benefits were also highlighted. Continuing education credits were offered to participants.
Overall, the meeting provided valuable insights into the advancements, challenges, and potential of digital pathology, AI, and technology integration in the federal health care ecosystem. However, that potential cannot be realized without leadership from, and close collaboration among, key industry, academic, and government stakeholders.
Uses of Digital Pathology
Digital pathology refers to the practice of digitizing glass slides containing tissue samples and using digital imaging technology to analyze and interpret them. It involves capturing high-resolution images of microscopic slides and storing them in a digital format. These digitized images can be accessed and analyzed using computer-based tools and software.
While traditional pathology involves examining tissue samples under a microscope to make diagnoses and provide insights into diseases and conditions, digital pathology uses digital scanners that capture all relevant tissue on the glass slide at high magnification. This process generates a high-fidelity digital representation of the tissue sample that can be navigated much as glass slides are reviewed on a brightfield microscope in current practice (eg, panning and zooming). Computer-assisted review of these digitized specimens can also reveal patterns and markers that may not be easily detectable by manual examination alone.
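To make the navigation workflow concrete, the following minimal Python sketch reads one region of a whole-slide image at full resolution. It is only an illustration: the open-source OpenSlide library and the slide file name are assumptions introduced here, not tools or data referenced in the seminar.

```python
# Minimal sketch of whole-slide image navigation.
# Assumes the openslide-python package is installed; "example_slide.svs" is a
# hypothetical file name used purely for illustration.
import openslide

slide = openslide.OpenSlide("example_slide.svs")

# Scanned slides are stored as an image pyramid: each level holds the tissue at a
# different downsampling factor, which is what enables smooth panning and zooming.
print("Pyramid levels:", slide.level_count)
print("Dimensions per level:", slide.level_dimensions)

# Read a 1024 x 1024 pixel region at the highest-resolution level (level 0),
# starting at pixel (x=10000, y=20000) in level-0 coordinates.
region = slide.read_region(location=(10000, 20000), level=0, size=(1024, 1024))
region.convert("RGB").save("region.png")

slide.close()
```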
The digitized slides can be stored in a database or a slide management system, allowing pathologists and other healthcare professionals to access and review them remotely, thus creating the potential to improve collaboration among pathologists, facilitate second opinions, and enable easier access to archived slides for research purposes.
Potential Benefits
Digital pathology also opens the door to advanced image analysis techniques, such as computer-aided diagnosis, machine learning, and AI algorithms, with the potential for the following outcomes and benefits:
- Improved accuracy: AI algorithms can analyze large volumes of digital pathology data with great precision, reducing the chances of human error and subjective interpretation. This can lead to more accurate and consistent diagnoses, especially in challenging cases where subtle patterns or features may be difficult to detect.
- Automated detection and classification: AI algorithms can be trained to detect and classify specific features or abnormalities in digital pathology images. For example, AI models can identify cancerous cells, tissue patterns associated with different diseases, or specific biomarkers. This can assist pathologists in diagnosing diseases more accurately and efficiently.
- Quantitative analysis: AI can analyze large quantities of digital pathology data and extract quantitative measurements. For instance, it can calculate the percentage of tumor cells in a sample, assess the density of immune cells, or measure the extent of tissue damage. These objective measurements can aid in prognosis prediction and treatment planning (a brief worked sketch of this kind of measurement follows this list).
- Image segmentation: AI algorithms can segment digital pathology images into different regions or structures, such as nuclei, cytoplasm, or blood vessels. This segmentation allows for precise analysis and extraction of features for further study. It can also facilitate the identification of specific cell types or tissue components.
- Image enhancement: AI techniques can enhance the quality of digital pathology images by improving clarity and reducing noise or artifacts. This can help pathologists visualize and interpret slides more effectively, especially in challenging cases with low-quality or complex images.
- Decision support systems: AI-powered decision support systems can assist pathologists by providing recommendations or second opinions based on the analysis of digital pathology data. These systems can offer insights, suggest potential diagnoses, or provide relevant research references, augmenting the pathologist’s expertise and improving diagnostic accuracy.
- Collaboration and second opinions: Digital pathology, combined with AI, facilitates remote access to digitized slides, enabling pathologists to seek second opinions or collaborate with experts from around the world. This can enhance the quality of diagnoses by leveraging the collective expertise of pathologists and fostering knowledge sharing.
- Education and training: AI algorithms can be utilized in virtual microscopy platforms to create interactive and educational experiences. Pathology residents and students can learn from annotated cases, receive real-time feedback, and develop their skills in a digital environment.
- Research and discovery: AI can assist in identifying patterns, correlations, and novel biomarkers in digital pathology data. By analyzing large datasets, AI algorithms can help uncover new insights, contribute to research advancements, and aid in the development of personalized medicine approaches.
- Predictive modeling: AI can analyze vast amounts of digital pathology data, patient records, and outcomes to develop predictive models. These models can estimate disease progression, treatment response, or patient survival rates based on various factors. They can contribute to personalized medicine by assisting in treatment decisions and prognosis assessment.
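As a simple illustration of the quantitative analysis item above, the sketch below computes a tumor-cell fraction from a hypothetical per-cell classification output. The labels and values are invented for demonstration; they do not come from any system discussed at the seminar.

```python
# Hedged sketch: estimating the percentage of tumor cells from a hypothetical
# per-cell AI classification result. All labels and data here are invented.
import numpy as np

# Suppose an AI model has assigned each detected cell a class label:
# 0 = background/other, 1 = benign epithelial cell, 2 = tumor cell, 3 = immune cell.
cell_labels = np.array([2, 1, 2, 3, 2, 1, 1, 2, 3, 2, 2, 1, 3, 2, 1])

n_cells = cell_labels.size
n_tumor = int(np.count_nonzero(cell_labels == 2))
n_immune = int(np.count_nonzero(cell_labels == 3))

tumor_fraction = 100.0 * n_tumor / n_cells
immune_fraction = 100.0 * n_immune / n_cells

print(f"Tumor cells: {tumor_fraction:.1f}% of {n_cells} detected cells")
print(f"Immune cells: {immune_fraction:.1f}% of detected cells")
```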
While AI has shown promising results, it is not intended to replace human pathologists but to augment their capabilities. The combination of AI technology with the expertise of pathologists can lead to improved diagnosis, better patient care, and more efficient workflows in digital pathology.
Two Diets Linked to Improved Cognition, Slowed Brain Aging
An intermittent fasting (IF) diet and a standard healthy living (HL) diet focused on healthy foods both led to weight loss, reduced insulin resistance (IR), and slowed brain aging in older overweight adults with IR, new research showed. However, neither diet had an effect on Alzheimer’s disease (AD) biomarkers.
Although investigators found both diets were beneficial, some outcomes were more robust with the IF diet.
“The study provides a blueprint for assessing brain effects of dietary interventions and motivates further research on intermittent fasting and continuous diets for brain health optimization,” wrote the investigators, led by Dimitrios Kapogiannis, MD, chief, human neuroscience section, National Institute on Aging, and adjunct associate professor of neurology, the Johns Hopkins University School of Medicine.
The findings were published online in Cell Metabolism.
Cognitive Outcomes
The prevalence of IR — reduced cellular sensitivity to insulin, a hallmark of type 2 diabetes — increases with age and obesity, contributing to an increased risk for accelerated brain aging as well as AD and related dementias (ADRD) in older adults with overweight.
Studies have reported that healthy diets promote overall health, but it’s unclear whether, and to what extent, they improve brain health beyond general health enhancement.
Researchers used multiple brain and cognitive measures to assess dietary effects on brain health, including peripherally harvested neuron-derived extracellular vesicles (NDEVs) to probe neuronal insulin signaling; MRI to investigate the pace of brain aging; magnetic resonance spectroscopy (MRS) to measure brain glucose, metabolites, and neurotransmitters; and NDEVs and cerebrospinal fluid to derive biomarkers for AD/ADRD.
The study included 40 cognitively intact overweight participants with IR, mean age 63.2 years, 60% women, and 62.5% White. Their mean body weight was 97.1 kg and mean body mass index (BMI) was 34.4.
Participants were randomly assigned to 8 weeks of an IF diet or an HL diet that emphasized fruits, vegetables, whole grains, lean proteins, and low-fat dairy and limited added sugars, saturated fats, and sodium.
The IF diet involved following the HL diet for 5 days per week and restricting calories to a quarter of the recommended daily intake for 2 consecutive days.
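As a worked example of the 5:2 schedule described above, the short sketch below computes a fasting-day calorie target from a recommended daily intake. The 2,000 kcal figure is an assumed illustration, not a value reported in the study.

```python
# Hedged example of the intermittent fasting arithmetic described in the article.
# The recommended daily intake below is an assumed illustration, not a study parameter.
recommended_daily_kcal = 2000                      # example recommended daily intake
fasting_day_kcal = recommended_daily_kcal / 4      # "a quarter of the recommended daily intake"

print(f"Non-fasting days (5 days/week): ~{recommended_daily_kcal} kcal, following the HL diet")
print(f"Fasting days (2 consecutive days/week): ~{fasting_day_kcal:.0f} kcal")
```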
Both diets reduced neuronal IR and had comparable effects in improving insulin signaling biomarkers in NDEVs, reducing brain glucose on MRS, and improving blood biomarkers of carbohydrate and lipid metabolism.
Using MRI, researchers also assessed brain age, an indication of whether the brain appears older or younger than an individual’s chronological age. There was a decrease of 2.63 years with the IF diet (P = .05) and 2.42 years with the HL diet (P < .001) in the anterior cingulate and ventromedial prefrontal cortex.
Both diets improved executive function and memory, with those following the IF diet benefiting more in strategic planning, switching between two cognitively demanding tasks, cued recall, and other areas.
Hypothesis-Generating Research
AD biomarkers, including amyloid beta 42 (Aβ42), Aβ40, and plasma phosphorylated-tau181, did not change with either diet, a finding that investigators speculated may be due to the short duration of the study. Neurofilament light chain increased across groups, with no differences between the diets.
In other findings, BMI decreased by 1.41 with the IF diet and by 0.80 with the HL diet, and a similar pattern was observed for weight. Waist circumference decreased in both groups with no significant differences between diets.
An exploratory analysis showed executive function improved with the IF diet but not with the HL diet in women, whereas it improved with both diets in men. BMI and apolipoprotein E and SLC16A7 genotypes also modulated diet effects.
Both diets were well tolerated. The most frequent adverse events were gastrointestinal and occurred only with the IF diet.
The authors noted the findings are preliminary and hypothesis generating. Study limitations included the short duration and limited statistical power, which allowed detection only of moderate to large effect sizes and between-diet differences. Researchers also didn’t collect dietary intake data, so lapses in adherence can’t be excluded. However, the large decreases in BMI, weight, and waist circumference with both diets indicated high adherence.
The study was supported by the National Institutes of Health’s National Institute on Aging. The authors reported no competing interests.
A version of this article first appeared on Medscape.com.
Heat Waves: A Silent Threat to Older Adults’ Kidneys
TOPLINE:
Older adults show an increase in creatinine and cystatin C levels after exposure to extreme heat in a dry setting despite staying hydrated; however, changes in these kidney function biomarkers are much more modest in a humid setting and in young adults.
METHODOLOGY:
- Older adults are vulnerable to heat-related morbidity and mortality, with kidney complications accounting for many excess hospital admissions during heat waves.
- Researchers investigated plasma-based markers of kidney function following extreme heat exposure for 3 hours in 20 young (21-39 years) and 18 older (65-76 years) adults recruited from the Dallas-Fort Worth area.
- All participants underwent heat exposure in a chamber at 47 °C (116 °F) and 15% relative humidity (dry setting) and 41 °C (105 °F) and 40% relative humidity (humid setting) on separate days. They performed light physical activity mimicking their daily tasks and drank 3 mL/kg body mass of water every hour while exposed to heat.
- Blood samples were collected at baseline, immediately before the end of heat exposure (end-heating), and 2 hours after heat exposure.
- Plasma creatinine was the primary outcome, with a change ≥ 0.3 mg/dL considered clinically meaningful. Cystatin C was the secondary outcome. (A brief worked example of these protocol numbers follows this list.)
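The sketch below illustrates two numbers from the protocol and outcome definitions above: the hourly water prescription of 3 mL per kg body mass and the ≥ 0.3 mg/dL creatinine rise treated as clinically meaningful. The body mass and creatinine values are assumed examples, not data from the study.

```python
# Hedged sketch of two protocol numbers from this study:
#   * hourly water intake of 3 mL per kg body mass during heat exposure
#   * a plasma creatinine rise of >= 0.3 mg/dL treated as clinically meaningful
# The body mass and creatinine values below are assumed examples.

def hourly_water_ml(body_mass_kg: float, dose_ml_per_kg: float = 3.0) -> float:
    """Water volume (mL) prescribed per hour of heat exposure."""
    return body_mass_kg * dose_ml_per_kg

def creatinine_change_is_meaningful(baseline_mg_dl: float, followup_mg_dl: float,
                                    threshold_mg_dl: float = 0.3) -> bool:
    """True if the rise in plasma creatinine meets the >= 0.3 mg/dL threshold."""
    return (followup_mg_dl - baseline_mg_dl) >= threshold_mg_dl

print(hourly_water_ml(80))                          # 240.0 mL/hour for an 80 kg adult
print(creatinine_change_is_meaningful(0.90, 1.07))  # False: +0.17 mg/dL is below the threshold
print(creatinine_change_is_meaningful(0.90, 1.25))  # True:  +0.35 mg/dL meets the threshold
```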
TAKEAWAY:
- The plasma creatinine level showed a modest increase from baseline to end-heating (difference, 0.10 mg/dL; P = .004) and at 2 hours post exposure (difference, 0.17 mg/dL; P < .001) in older adults facing heat exposure in the dry setting.
- The mean cystatin C levels also increased from baseline to end-heating by 0.29 mg/L (P = .01) and at 2 hours post heat exposure by 0.28 mg/L (P = .004) in older adults in the dry setting.
- The mean creatinine levels increased by only 0.06 mg/dL (P = .01) from baseline to 2 hours post exposure in older adults facing heat exposure in the humid setting.
- Young adults didn’t show any significant change in the plasma cystatin C levels during or after heat exposure; however, there was a modest increase in the plasma creatinine levels at 2 hours after heat exposure (difference, 0.06 mg/dL; P = .004).
IN PRACTICE:
“These findings provide limited evidence that the heightened thermal strain in older adults during extreme heat may contribute to reduced kidney function,” the authors wrote.
SOURCE:
The study was led by Zachary J. McKenna, PhD, from the Department of Internal Medicine, University of Texas Southwestern Medical Center, Dallas, Texas, and was published online in JAMA.
LIMITATIONS:
The use of plasma-based markers of kidney function, a short laboratory-based exposure, and a small number of generally healthy participants were the main limitations that could affect the generalizability of this study’s findings to broader populations and real-world settings.
DISCLOSURES:
The National Institutes of Health and American Heart Association funded this study. Two authors declared receiving grants and nonfinancial support from several sources.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
Risk Stratification May Work Well for FIT-Based CRC Screening in Elderly
WASHINGTON — Risk stratification may help determine which adults older than 75 years are likely to benefit from continued colorectal cancer (CRC) screening, according to a study presented at the annual Digestive Disease Week® (DDW).
In particular, interval CRC risk can vary substantially based on the fecal hemoglobin (f-Hb) concentration in the patient’s last fecal immunochemical test (FIT), as well as the number of prior screening rounds.
“Less is known about what happens after the upper age limit has been reached and individuals are not invited to participate in more screening rounds. This is important as life expectancy is increasing, and it is increasingly important to consider the most efficient way of screening the elderly,” said lead author Brenda van Stigt, a PhD candidate focused on cancer screening at Erasmus University Medical Center in Rotterdam, the Netherlands.
In the Netherlands, adults between ages 55 and 75 are invited to participate in stool-based CRC screening every 2 years. Based on a FIT threshold of 47 μg Hb/g, those who test positive are referred for colonoscopy, and those who test negative are invited to participate again after 2 years.
FIT can play a major role in risk stratification, Ms. van Stigt noted, along with other factors that influence CRC risk, such as age, sex, and CRC screening history. Although this is documented for ages 55-75, she and colleagues wanted to know more about what happens after age 75.
Ms. van Stigt and colleagues conducted a population-based study, analyzing Dutch national cancer registry data and FIT results around the final screening at age 75 and identifying those who were diagnosed with CRC within 24 months of their last negative FIT. The researchers assessed interval CRC risk and cancer stage, accounting for sex, last f-Hb concentration, and the number of screening rounds.
Among 305,761 people with a complete 24-month follow-up after a negative FIT, 661 patients were diagnosed with interval CRC, indicating an overall interval CRC risk of 21.6 per 10,000 individuals with a negative FIT. There were no significant differences by sex.
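The overall risk figure above follows directly from the two reported counts, as the short sketch below reproduces. Only the numbers stated in the text (661 interval cancers among 305,761 people with a negative FIT) are used.

```python
# Reproducing the reported overall interval CRC risk from the counts in the text.
interval_crc_cases = 661
negative_fit_participants = 305_761

risk_per_10k = interval_crc_cases / negative_fit_participants * 10_000
print(f"Interval CRC risk: {risk_per_10k:.1f} per 10,000 negative FITs")  # ~21.6
```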
However, there were differences by screening rounds, with those who had participated in three or four screening rounds having a lower risk than those who participated only once (HR, 0.49).
In addition, those with detectable f-Hb (>0 μg Hb/g) in their last screening round had a much higher interval CRC risk (HR, 4.87), at 65.8 per 10,000 negative FITs, compared with 13.8 per 10,000 among those without detectable f-Hb. Interval CRC risk also increased over time for those with detectable f-Hb.
About 15% of the total population had detectable f-Hb, whereas 46% of those with interval CRC had detectable f-Hb, Ms. van Stigt said, meaning that nearly half of patients who were diagnosed with interval CRC already had detectable f-Hb in their prior FIT.
In a survival analysis, there was no association between interval CRC risk and sex. However, those who participated in three or four screening rounds were half as likely to be diagnosed as those who participated once or twice, and those with detectable f-Hb were five times as likely to be diagnosed.
For late-stage CRC, there was no association with sex or the number of screening rounds. Detectable f-Hb was associated with not only a higher risk of interval CRC but also a late-stage diagnosis.
“These findings indicate that one uniform age to stop screening is suboptimal,” Ms. van Stigt said. “Personalized screening strategies should, therefore, also ideally incorporate a risk-stratified age to stop screening.”
The US Preventive Services Task Force recommends that clinicians personalize screening for ages 76-85, accounting for overall health, prior screening history, and patient preferences.
“But we have no clear guidance on how to quantify or weigh these factors. This interesting study highlights how one of these factors (prior screening history) and fecal hemoglobin level (an emerging factor) are powerful stratifiers of subsequent colorectal cancer risk,” said Sameer D. Saini, MD, AGAF, director and research investigator at the VA Ann Arbor Healthcare System’s Center for Clinical Management Research. Dr. Saini wasn’t involved with the study.
At the clinical level, Dr. Saini said, sophisticated modeling is needed to understand the interaction with competing risks and identify the optimal screening strategies for patients at varying levels of cancer risk and life expectancy. Models could also help to quantify the population benefits and cost-effectiveness of personalized screening.
“Finally, it is important to note that, in many health systems, access to quantitative FIT may be limited,” he said. “These data may be less informative if colonoscopy is the primary mode of screening.”
Ms. van Stigt and Dr. Saini reported no relevant disclosures.
FROM DDW 2024
Statins, Vitamin D, and Exercise in Older Adults
In this article, I will review several recently published articles and guidelines relevant to the care of older adults in primary care. The articles of interest address statins for primary prevention, vitamin D supplementation and testing, and physical activity for healthy aging.
Statins for Primary Prevention of Cardiovascular Disease
A common conundrum in primary care is whether an older adult should be on a statin for primary prevention. This question has been difficult to answer because of the underrepresentation of older adults in clinical trials that examine the effect of statins for primary prevention. A recent study by Xu et al. published in Annals of Internal Medicine sought to address this gap in knowledge, investigating the risks and benefits of statins for primary prevention for older adults.1
This study stratified participants by “old” (aged 75-84 years) and “very old” (85 years or older). Older adults who had an indication for statins were initiated on therapy, followed over a 5-year period, and compared with age-matched cohorts not initiated on statin therapy. Participants with known cardiovascular disease at baseline were excluded. The outcomes of interest were major cardiovascular disease (CVD; a composite of myocardial infarction, stroke, or heart failure), all-cause mortality, and adverse effects of drug therapy (myopathy or liver dysfunction).
The study found that among older adults aged 75-84, initiation of statin therapy led to a 1.2% risk reduction in major CVD over a 5-year period. For older adults aged 85 and greater, initiation of statins had an even larger impact, leading to a 4.4% risk reduction in major CVD over a 5-year period. The study found that there was no significant difference in adverse effects including myopathy or liver dysfunction in both age groups.
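To put these absolute risk reductions in context, the sketch below converts them into approximate numbers needed to treat over 5 years; this is simple arithmetic on the figures quoted above, not an analysis from the study.

```python
# Converting the reported absolute risk reductions into numbers needed to treat (NNT).
arr_ages_75_84 = 0.012   # 1.2% absolute reduction in major CVD over 5 years
arr_ages_85_up = 0.044   # 4.4% absolute reduction in major CVD over 5 years

print(f"NNT, ages 75-84: {1 / arr_ages_75_84:.0f} over 5 years")   # ~83
print(f"NNT, ages 85+:   {1 / arr_ages_85_up:.0f} over 5 years")   # ~23
```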
Statins, the study suggests, are appropriate and safe to initiate for primary prevention in older adults and can lead to substantial benefits in reduction of CVD. While time to benefit was not explicitly examined in this study, a prior study by Yourman et al. suggested that the time to benefit for statins for primary prevention in adults aged 50-75 would be at least 2.5 years.2
My takeaway from these findings is to discuss statin initiation for primary prevention for older patients who are focused on longevity, have good functional status (often used in geriatrics as a proxy for prognosis), and are willing to accept more medications.
Empiric Vitamin D Supplementation in Adults over 75 Years
Vitamin D is one of the most common supplements taken by older adults, but the evidence supporting supplementation is variable in the published literature, as most data come from observational trials. New guidelines from the Endocrine Society focused on developing recommendations for healthy individuals, using data from randomized controlled trials (RCTs) and, where RCTs were not available, large longitudinal observational trials with comparison groups. These guidelines recommend against empiric supplementation of vitamin D for healthy adults aged 18-74, excluding pregnant women and patients with high-risk diabetes.3
For older adults aged 75 or greater, empiric vitamin D supplementation is recommended because of the possible reduction of risk in all-cause mortality in this population. Of note, this was a grade 2 recommendation by the panel, indicating that the benefits of the treatment probably outweigh the risks. The panel stated that vitamin D supplementation could be delivered through fortified foods, multivitamins with vitamin D, or as a separate vitamin D supplement.
The dosage should remain within the Institute of Medicine's recommended daily allowance of 800 IU daily for adults over 70, and the panel recommends low-dose daily vitamin D supplementation over high-dose interval supplementation. The panel noted that routine screening of vitamin D levels should not be used to guide decision-making on whether to start supplementation, but vitamin D levels should be obtained for patients who have an indication for evaluation.
The reviewers highlight that these guidelines were developed for healthy individuals and are not applicable to those with conditions that warrant vitamin D evaluation. In my clinical practice, many of my patients have bone-mineral conditions and cognitive impairment that warrant evaluation. Based on these guidelines, I will consider empiric vitamin D supplementation more often for healthy patients aged 75 and older.
Sedentary Behaviors and Healthy Aging
Engaging inactive older adults in regular physical activity can be challenging, particularly as the pandemic has led to more pervasive social isolation and affected the availability of in-person exercise activities in the community. Physical activity is a key component of healthy aging and cognition, and its benefits should be a part of routine counseling for older adults.
An interesting recent study published in JAMA Network Open by Shi et al. evaluated the association of health behaviors and aging in female US nurses over a 20-year period.4 Surveys were administered to capture time spent in each behavior, such as being sedentary (TV watching, sitting at home or at work), light activity (walking around the house or at work), and moderate to vigorous activity (walking for exercise, lawn mowing). “Healthy aging” was defined by the absence of chronic conditions such as heart failure, and lack of physical, mental, and cognitive impairment.
The study found that participants who were more sedentary were less likely to age healthfully, with each additional 2 hours of TV watching per day associated with a 12% reduction in likelihood of healthy aging. Light physical activity was associated with a significant increase in healthy aging, with a 6% increase in the likelihood of healthy aging for each additional 2 hours of light activity. Each additional 1 hour of moderate to vigorous activity was associated with a 14% increase in the likelihood of healthy aging. These findings support discussions with patients that behavior change, even in small increments, can be beneficial in healthy aging.
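To give a rough sense of how these per-increment associations might combine for a single hypothetical behavior change, the sketch below multiplies the reported relative likelihoods. This assumes the effects are independent and multiplicative, which the study itself does not claim; the numbers are for intuition only.

```python
# Rough composition of the reported associations for hypothetical behavior changes.
# Assumes effects combine multiplicatively and independently, which the study does not claim.
tv_per_extra_2h = 0.88     # 12% lower likelihood of healthy aging per additional 2 h/day of TV
light_per_extra_2h = 1.06  # 6% higher likelihood per additional 2 h/day of light activity
mvpa_per_extra_1h = 1.14   # 14% higher likelihood per additional 1 h/day of moderate-to-vigorous activity

swap_tv_for_light = (1 / tv_per_extra_2h) * light_per_extra_2h   # 2 h TV replaced by 2 h light activity
swap_tv_for_mvpa = (1 / tv_per_extra_2h) * mvpa_per_extra_1h     # 2 h TV replaced by 1 h moderate-to-vigorous activity
print(f"2 h TV -> 2 h light activity: {swap_tv_for_light:.2f}x")   # ~1.20x
print(f"2 h TV -> 1 h MVPA:           {swap_tv_for_mvpa:.2f}x")    # ~1.30x
```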
References
1. Xu W et al. Ann Intern Med. 2024 Jun;177(6):701-10.
2. Yourman LC et al. JAMA Intern Med. 2021;181:179-85.
3. Demay MB et al. J Clin Endocrinol Metab. August 2024;109(8):1907-47.
4. Shi H et al. JAMA Netw Open. 2024;7(6):e2416300.
Atogepant May Prevent Rebound Headache From Medication Overuse in Chronic Migraine
The oral calcitonin gene-related peptide receptor antagonist atogepant is effective in preventing rebound headache related to medication overuse in patients with chronic migraine (CM), new research suggested.
Results of a subgroup analysis of a phase 3, 12-week randomized, double-blind, placebo-controlled trial showed up to a 62% reduction in the proportion of atogepant-treated participants who met acute medication overuse criteria.
“Based on our findings, treatment with atogepant may potentially decrease the risk of developing rebound headache by reducing the use of pain medications,” principal investigator Peter Goadsby, MD, PhD, of King’s College London, London, England, said in a news release.
The study was published online in Neurology.
Effective Prevention Needed
Acute treatments for migraine can mitigate symptoms and reduce disability but can also be ineffective and even result in increased dosing and overuse of these medications, the investigators noted.
Acute medication overuse is defined as “taking simple analgesics for ≥ 15 days per month or taking triptans, ergots, opioids, or combinations of medications for ≥ 10 days per month.”
“There is a high prevalence of pain medication overuse among people with migraine as they try to manage what are often debilitating symptoms,” Dr. Goadsby said. “However, medication overuse can lead to more headaches, called rebound headaches, so more effective preventive treatments are needed.”
Atogepant was developed for migraine prevention in adults. It had been studied in the phase 3 PROGRESS trial, which showed it significantly reduced monthly migraine days (MMDs) compared with placebo during the 12-week trial.
The new subgroup analysis of the study focused specifically on the efficacy and safety of atogepant vs placebo in participants with CM with, and without, medication overuse.
Participants (mean age, 42.1 years; 87.6% women) were randomized to receive either atogepant 30 mg twice daily (n = 253), atogepant 60 mg once daily (n = 256), or placebo (n = 240), with baseline demographics and clinical characteristics similar across all treatment arms. A total of 66.2% met baseline acute medication overuse criteria.
Participants were asked to record migraine and headache experiences in an electronic diary.
‘Effective and Safe’
Participants in both atogepant groups experienced fewer MMDs than those in the placebo group, with a least squares mean difference (LSMD) of −2.7 (95% confidence interval [CI], −4.0 to −1.4) in the atogepant 30 mg twice daily group and −1.9 (95% CI, −3.2 to −0.6) in the atogepant 60 mg once daily group.
Monthly headache days (MHDs) were also reduced in both treatment groups, with LSMDs of −2.8 (95% CI, −4.0 to −1.5) and −2.1 (95% CI, −3.3 to −0.8), respectively. Mean acute medication use days were lower in both the treatment groups, with LSMDs of −2.8 (95% CI, −4.1 to −1.6) and −2.6 (95% CI, −3.9 to −1.3), respectively.
A higher proportion of participants achieved a ≥ 50% reduction in MMDs with atogepant 30 mg twice daily (odds ratio [OR], 2.5; 95% CI, 1.5-4.0) and atogepant 60 mg once daily (OR, 2.3; 95% CI, 1.4-3.7).
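Because odds ratios alone do not convey absolute response rates, the sketch below shows how an assumed placebo responder rate, a hypothetical placeholder that is not reported in this summary, would translate the quoted odds ratios into treatment responder rates.

```python
# Converting an odds ratio into an absolute responder rate requires a baseline rate.
# The placebo responder rate below is a hypothetical placeholder; only the odds ratios are reported here.
def responder_rate(placebo_rate: float, odds_ratio: float) -> float:
    """Treatment responder rate implied by a placebo rate and an odds ratio."""
    odds_placebo = placebo_rate / (1 - placebo_rate)
    odds_treated = odds_placebo * odds_ratio
    return odds_treated / (1 + odds_treated)

placebo_rate = 0.25  # assumed for illustration only
for label, odds_ratio in [("atogepant 30 mg twice daily", 2.5), ("atogepant 60 mg once daily", 2.3)]:
    print(f"{label}: {responder_rate(placebo_rate, odds_ratio):.0%} achieve a >=50% reduction in MMDs")
# With a 25% placebo rate, the implied treatment responder rates are roughly 45% and 43%.
```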
Notably, the researchers found a 52.1%-61.9% reduction in the proportion of atogepant-treated participants meeting acute medication overuse criteria during the study period vs 38.3% in the placebo group.
Similar results were observed in the subgroup without acute medication overuse.
Treatment-emergent adverse events were reported by 55.8% of participants treated with atogepant 30 mg twice daily, 66.1% with atogepant 60 mg once daily, and 48.5% with placebo in the acute medication overuse subgroup, with similar reports in the non-overuse subgroup.
A limitation cited by the authors was that participants’ self-report of migraines and headaches via electronic diaries might have been inaccurate.
Nevertheless, they concluded that the results showed atogepant to be an “effective and safe” preventive treatment for patients with CM with, and without, acute medication overuse.
AbbVie funded this study and participated in the study design, research, analysis, data collection, interpretation of data, reviewing, and approval of the publication. No honoraria or payments were made for authorship. Dr. Goadsby received personal fees from AbbVie during the conduct of the study, and over the last 36 months, he received a research grant from Celgene; personal fees from Aeon Biopharma, Amgen, CoolTechLLC, Dr. Reddy’s, Eli Lilly and Company, Epalex, Lundbeck, Novartis, Pfizer, Praxis, Sanofi, Satsuma, ShiraTronics, Teva Pharmaceuticals, and Tremeau; personal fees for advice through Gerson Lehrman Group, Guidepoint, SAI Med Partners, and Vector Metric; fees for educational materials from CME Outfitters; and publishing royalties or fees from Massachusetts Medical Society, Oxford University Press, UpToDate, and Wolters Kluwer. The other authors’ disclosures are listed on the original paper.
A version of this article first appeared on Medscape.com.
FROM NEUROLOGY
TBI Significantly Increases Mortality Rate Among Veterans With Epilepsy
Traumatic brain injury (TBI) significantly increases the mortality rate among veterans with epilepsy, according to recent research published in Epilepsia.
In a retrospective cohort study, Ali Roghani, PhD, of the division of epidemiology at the University of Utah School of Medicine in Salt Lake City, and colleagues evaluated 938,890 veterans between 2000 and 2019 in the Defense Health Agency and the Veterans Health Administration who served in the US military after the September 11 attacks. Overall, 27,436 veterans met criteria for a diagnosis of epilepsy, 264,890 had received a diagnosis of TBI, and the remaining patients had neither epilepsy nor TBI.
Among the veterans without epilepsy, 248,714 had a TBI diagnosis. In the group with epilepsy, 10,358 veterans experienced a TBI before their epilepsy diagnosis, 1598 were diagnosed with a TBI within 6 months of epilepsy, and 4310 had a TBI more than 6 months after an epilepsy diagnosis. The researchers assessed all-cause mortality in each group, calculating cumulative mortality rates compared with the group of veterans who had no TBI and no epilepsy diagnosis.
Dr. Roghani and colleagues found a significantly higher mortality rate among veterans who developed epilepsy compared with a control group with neither epilepsy nor TBI (6.26% vs. 1.12%; P < .01), with a majority of veterans in the group who died being White (67.4%) men (89.9%). Compared with veterans who were deceased, nondeceased veterans were significantly more likely to have a history of being deployed (70.7% vs. 64.8%; P < .001), were less likely to be in the army (52.2% vs. 55.0%; P < .001), and were more likely to reach the rank of officer or warrant officer (8.1% vs. 7.6%; P = .014).
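As simple context for these percentages, the sketch below computes the crude (unadjusted) risk ratio and absolute difference between the two cumulative mortality rates quoted above; it is separate from the adjusted Cox hazard ratios reported later in the article.

```python
# Crude (unadjusted) comparison of the two cumulative mortality rates quoted above;
# distinct from the adjusted Cox hazard ratios reported later.
mortality_epilepsy = 0.0626   # veterans who developed epilepsy
mortality_control = 0.0112    # veterans with neither epilepsy nor TBI

print(f"Crude risk ratio:    {mortality_epilepsy / mortality_control:.1f}")    # ~5.6
print(f"Absolute difference: {mortality_epilepsy - mortality_control:.2%}")    # ~5.14%
```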
There were also significant differences in clinical characteristics between nondeceased and deceased veterans, including a higher rate of substance abuse disorder, smoking history, cardiovascular disease, stroke, transient ischemic attack, cancer, liver disease, kidney disease, or other injury as well as overdose, suicidal ideation, and homelessness. “Most clinical conditions were significantly different between deceased and nondeceased in part due to the large cohort size,” the researchers said.
After performing Cox regression analyses, the researchers found a higher mortality risk in veterans with epilepsy and/or TBIs among those who developed a TBI within 6 months of an epilepsy diagnosis (hazard ratio [HR], 5.02; 95% CI, 4.21-5.99), had a TBI prior to epilepsy (HR, 4.25; 95% CI, 3.89-4.58), had epilepsy alone (HR, 4.00; 95% CI, 3.67-4.36), had a TBI more than 6 months after an epilepsy diagnosis (HR, 2.49; 95% CI, 2.17-2.85), and had a TBI alone (HR, 1.30; 95% CI, 1.25-1.36), compared with veterans who had neither epilepsy nor a TBI.
“The temporal relationship with TBI that occurred within 6 months after epilepsy diagnosis may suggest an increased vulnerability to accidents, severe injuries, or TBI resulting from seizures, potentially elevating mortality risk,” Dr. Roghani and colleagues wrote.
The researchers said the results “raise concerns” about the subgroup of patients who are diagnosed with epilepsy close to experiencing a TBI.
“Our results provide information regarding the temporal relationship between epilepsy and TBI regarding mortality in a cohort of post-9/11 veterans, which highlights the need for enhanced primary prevention, such as more access to health care among people with epilepsy and TBI,” they said. “Given the rising incidence of TBI in both the military and civilian populations, these findings suggest close monitoring might be crucial to develop effective prevention strategies for long-term complications, particularly [post-traumatic epilepsy].”
Reevaluating the Treatment of Epilepsy
Juliann Paolicchi, MD, a neurologist and member of the epilepsy team at Northwell Health in New York, who was not involved with the study, said in an interview that TBIs have been studied more closely since the beginning of conflicts in the Middle East, particularly in Iraq and Afghanistan, where “newer artillery causes more diffuse traumatic injury to the brain and the body than the effects of more typical weaponry.”
The study by Roghani and colleagues, she said, “is groundbreaking in that it looks at the connection and timing of these two disruptive forces, epilepsy and TBI, on the brain,” she said. “The study reveals that timing is everything: The combination of two disrupting circuitry effects in proximity can have a deadly effect. The summation is greater than either alone in veterans, and has significant effects on the brain’s ability to sustain the functions that keep us alive.”
The 6 months following either a diagnosis of epilepsy or TBI is “crucial,” Dr. Paolicchi noted. “Military and private citizens should be closely monitored during this period, and the results suggest they should refrain from activities that could predispose to further brain injury.”
In addition, current standards for treatment of epilepsy may need to be reevaluated, she said. “Patients are not always treated with a seizure medication after a first seizure, but perhaps, especially in patients at higher risk for brain injury such as the military and athletes, that policy warrants further examination.”
The findings by Roghani and colleagues may also extend to other groups, such as evaluating athletes after a concussion, patients after they are in a motor vehicle accident, and infants with traumatic brain injury, Dr. Paolicchi said. “The results suggest a reexamining of the proximity [of TBI] and epilepsy in these and other areas,” she noted.
The authors reported personal and institutional relationships in the form of research support and other financial compensation from AbbVie, Biohaven, CURE, Department of Defense, Department of Veterans Affairs (VA), Eisai, Engage, National Institutes of Health, Sanofi, SCS Consulting, Sunovion, and UCB. This study was supported by funding from the Department of Defense, VA Health Systems, and the VA HSR&D Informatics, Decision Enhancement, and Analytic Sciences Center of Innovation. Dr. Paolicchi reports no relevant conflicts of interest.
recent research published in Epilepsia.
, according toIn a retrospective cohort study, Ali Roghani, PhD, of the division of epidemiology at the University of Utah School of Medicine in Salt Lake City, and colleagues evaluated 938,890 veterans between 2000 and 2019 in the Defense Health Agency and the Veterans Health Administration who served in the US military after the September 11 attacks. Overall, 27,436 veterans met criteria for a diagnosis of epilepsy, 264,890 had received a diagnosis for a traumatic brain injury (TBI), and the remaining patients had neither epilepsy nor TBI.
Among the veterans with no epilepsy, 248,714 veterans had a TBI diagnosis, while in the group of patients with epilepsy, 10,358 veterans experienced a TBI before their epilepsy diagnosis, 1598 were diagnosed with a TBI within 6 months of epilepsy, and 4310 veterans had a TBI 6 months after an epilepsy diagnosis. The researchers assessed all-cause mortality in each group, calculating cumulative mortality rates compared with the group of veterans who had no TBI and no epilepsy diagnosis.
Dr. Roghani and colleagues found a significantly higher mortality rate among veterans who developed epilepsy compared with a control group with neither epilepsy nor TBI (6.26% vs. 1.12%; P < .01), with a majority of veterans in the group who died being White (67.4%) men (89.9%). Compared with veterans who were deceased, nondeceased veterans were significantly more likely to have a history of being deployed (70.7% vs. 64.8%; P < .001), were less likely to be in the army (52.2% vs. 55.0%; P < .001), and were more likely to reach the rank of officer or warrant officer (8.1% vs. 7.6%; P = .014).
There were also significant differences in clinical characteristics between nondeceased and deceased veterans, including a higher rate of substance abuse disorder, smoking history, cardiovascular disease, stroke, transient ischemic attack, cancer, liver disease, kidney disease, or other injury as well as overdose, suicidal ideation, and homelessness. “Most clinical conditions were significantly different between deceased and nondeceased in part due to the large cohort size,” the researchers said.
After performing Cox regression analyses, the researchers found a higher mortality risk in veterans with epilepsy and/or TBIs among those who developed a TBI within 6 months of an epilepsy diagnosis (hazard ratio [HR], 5.02; 95% CI, 4.21-5.99), had a TBI prior to epilepsy (HR, 4.25; 95% CI, 3.89-4.58), had epilepsy alone (HR, 4.00; 95% CI, 3.67-4.36), had a TBI more than 6 months after an epilepsy diagnosis (HR, 2.49; 95% CI, 2.17-2.85), and those who had epilepsy alone (HR, 1.30; 95% CI, 1.25-1.36) compared with veterans who had neither epilepsy nor a TBI.
“The temporal relationship with TBI that occurred within 6 months after epilepsy diagnosis may suggest an increased vulnerability to accidents, severe injuries, or TBI resulting from seizures, potentially elevating mortality risk,” Dr. Roghani and colleagues wrote.
The researchers said the results “raise concerns” about the subgroup of patients who are diagnosed with epilepsy close to experiencing a TBI.
“Our results provide information regarding the temporal relationship between epilepsy and TBI regarding mortality in a cohort of post-9/11 veterans, which highlights the need for enhanced primary prevention, such as more access to health care among people with epilepsy and TBI,” they said. “Given the rising incidence of TBI in both the military and civilian populations, these findings suggest close monitoring might be crucial to develop effective prevention strategies for long-term complications, particularly [post-traumatic epilepsy].”
Reevaluating the Treatment of Epilepsy
Juliann Paolicchi, MD, a neurologist and member of the epilepsy team at Northwell Health in New York, who was not involved with the study, said in an interview that TBIs have been studied more closely since the beginning of conflicts in the Middle East, particularly in Iran and Afghanistan, where “newer artillery causes more diffuse traumatic injury to the brain and the body than the effects of more typical weaponry.”
The study by Roghani and colleagues, she said, “is groundbreaking in that it looks at the connection and timing of these two disruptive forces, epilepsy and TBI, on the brain,” she said. “The study reveals that timing is everything: The combination of two disrupting circuitry effects in proximity can have a deadly effect. The summation is greater than either alone in veterans, and has significant effects on the brain’s ability to sustain the functions that keep us alive.”
The 6 months following either a diagnosis of epilepsy or TBI is “crucial,” Dr. Paolicchi noted. “Military and private citizens should be closely monitored during this period, and the results suggest they should refrain from activities that could predispose to further brain injury.”
In addition, current standards for treatment of epilepsy may need to be reevaluated, she said. “Patients are not always treated with a seizure medication after a first seizure, but perhaps, especially in patients at higher risk for brain injury such as the military and athletes, that policy warrants further examination.”
The findings by Roghani and colleagues may also extend to other groups, such as evaluating athletes after a concussion, patients after they are in a motor vehicle accident, and infants with traumatic brain injury, Dr. Paolicchi said. “The results suggest a reexamining of the proximity [of TBI] and epilepsy in these and other areas,” she noted.
The authors reported personal and institutional relationships in the form of research support and other financial compensation from AbbVie, Biohaven, CURE, Department of Defense, Department of Veterans Affairs (VA), Eisai, Engage, National Institutes of Health, Sanofi, SCS Consulting, Sunovion, and UCB. This study was supported by funding from the Department of Defense, VA Health Systems, and the VA HSR&D Informatics, Decision Enhancement, and Analytic Sciences Center of Innovation. Dr. Paolicchi reports no relevant conflicts of interest.
recent research published in Epilepsia.
, according toIn a retrospective cohort study, Ali Roghani, PhD, of the division of epidemiology at the University of Utah School of Medicine in Salt Lake City, and colleagues evaluated 938,890 veterans between 2000 and 2019 in the Defense Health Agency and the Veterans Health Administration who served in the US military after the September 11 attacks. Overall, 27,436 veterans met criteria for a diagnosis of epilepsy, 264,890 had received a diagnosis for a traumatic brain injury (TBI), and the remaining patients had neither epilepsy nor TBI.
Among the veterans with no epilepsy, 248,714 veterans had a TBI diagnosis, while in the group of patients with epilepsy, 10,358 veterans experienced a TBI before their epilepsy diagnosis, 1598 were diagnosed with a TBI within 6 months of epilepsy, and 4310 veterans had a TBI 6 months after an epilepsy diagnosis. The researchers assessed all-cause mortality in each group, calculating cumulative mortality rates compared with the group of veterans who had no TBI and no epilepsy diagnosis.
Dr. Roghani and colleagues found a significantly higher mortality rate among veterans who developed epilepsy compared with a control group with neither epilepsy nor TBI (6.26% vs. 1.12%; P < .01), with a majority of veterans in the group who died being White (67.4%) men (89.9%). Compared with veterans who were deceased, nondeceased veterans were significantly more likely to have a history of being deployed (70.7% vs. 64.8%; P < .001), were less likely to be in the army (52.2% vs. 55.0%; P < .001), and were more likely to reach the rank of officer or warrant officer (8.1% vs. 7.6%; P = .014).
There were also significant differences in clinical characteristics between nondeceased and deceased veterans, including a higher rate of substance abuse disorder, smoking history, cardiovascular disease, stroke, transient ischemic attack, cancer, liver disease, kidney disease, or other injury as well as overdose, suicidal ideation, and homelessness. “Most clinical conditions were significantly different between deceased and nondeceased in part due to the large cohort size,” the researchers said.
After performing Cox regression analyses, the researchers found a higher mortality risk in veterans with epilepsy and/or TBIs among those who developed a TBI within 6 months of an epilepsy diagnosis (hazard ratio [HR], 5.02; 95% CI, 4.21-5.99), had a TBI prior to epilepsy (HR, 4.25; 95% CI, 3.89-4.58), had epilepsy alone (HR, 4.00; 95% CI, 3.67-4.36), had a TBI more than 6 months after an epilepsy diagnosis (HR, 2.49; 95% CI, 2.17-2.85), and those who had epilepsy alone (HR, 1.30; 95% CI, 1.25-1.36) compared with veterans who had neither epilepsy nor a TBI.
“The temporal relationship with TBI that occurred within 6 months after epilepsy diagnosis may suggest an increased vulnerability to accidents, severe injuries, or TBI resulting from seizures, potentially elevating mortality risk,” Dr. Roghani and colleagues wrote.
The researchers said the results “raise concerns” about the subgroup of patients who are diagnosed with epilepsy close to experiencing a TBI.
“Our results provide information regarding the temporal relationship between epilepsy and TBI regarding mortality in a cohort of post-9/11 veterans, which highlights the need for enhanced primary prevention, such as more access to health care among people with epilepsy and TBI,” they said. “Given the rising incidence of TBI in both the military and civilian populations, these findings suggest close monitoring might be crucial to develop effective prevention strategies for long-term complications, particularly [post-traumatic epilepsy].”
Reevaluating the Treatment of Epilepsy
Juliann Paolicchi, MD, a neurologist and member of the epilepsy team at Northwell Health in New York, who was not involved with the study, said in an interview that TBIs have been studied more closely since the beginning of conflicts in the Middle East, particularly in Iraq and Afghanistan, where “newer artillery causes more diffuse traumatic injury to the brain and the body than the effects of more typical weaponry.”
The study by Roghani and colleagues “is groundbreaking in that it looks at the connection and timing of these two disruptive forces, epilepsy and TBI, on the brain,” she said. “The study reveals that timing is everything: The combination of two disrupting circuitry effects in proximity can have a deadly effect. The summation is greater than either alone in veterans, and has significant effects on the brain’s ability to sustain the functions that keep us alive.”
The 6 months following either a diagnosis of epilepsy or TBI is “crucial,” Dr. Paolicchi noted. “Military and private citizens should be closely monitored during this period, and the results suggest they should refrain from activities that could predispose to further brain injury.”
In addition, current standards for treatment of epilepsy may need to be reevaluated, she said. “Patients are not always treated with a seizure medication after a first seizure, but perhaps, especially in patients at higher risk for brain injury such as the military and athletes, that policy warrants further examination.”
The findings by Roghani and colleagues may also extend to other groups, such as evaluating athletes after a concussion, patients after they are in a motor vehicle accident, and infants with traumatic brain injury, Dr. Paolicchi said. “The results suggest a reexamining of the proximity [of TBI] and epilepsy in these and other areas,” she noted.
The authors reported personal and institutional relationships in the form of research support and other financial compensation from AbbVie, Biohaven, CURE, Department of Defense, Department of Veterans Affairs (VA), Eisai, Engage, National Institutes of Health, Sanofi, SCS Consulting, Sunovion, and UCB. This study was supported by funding from the Department of Defense, VA Health Systems, and the VA HSR&D Informatics, Decision Enhancement, and Analytic Sciences Center of Innovation. Dr. Paolicchi reports no relevant conflicts of interest.
FROM EPILEPSIA
Study Estimates Global Prevalence of Seborrheic Dermatitis
TOPLINE:
Seborrheic dermatitis affects an estimated 4% of people worldwide, according to a meta-analysis that also found a higher prevalence in adults than in children.
METHODOLOGY:
- Researchers conducted a meta-analysis of 121 studies, which included 1,260,163 people with clinician-diagnosed seborrheic dermatitis.
- The included studies represented nine countries; most were from India (n = 18), Turkey (n = 13), and the United States (n = 8).
- The primary outcome was the pooled prevalence of seborrheic dermatitis (see the sketch after this list for an illustration of how such pooling works).
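As a rough illustration of what a pooled prevalence is, the sketch below combines a few hypothetical study estimates using fixed-effect, inverse-variance weighting on the logit scale. The study counts are invented, and the published analysis may well have used a different (for example, random-effects) model.

```python
# Illustrative only: hypothetical study counts, simple fixed-effect pooling.
import math

studies = [(420, 9_600), (130, 2_500), (88, 1_900)]   # (cases, sample size) per study

weights, weighted_logits = [], []
for cases, n in studies:
    p = cases / n
    logit = math.log(p / (1 - p))
    var = 1 / (n * p * (1 - p))    # approximate variance of the logit-transformed prevalence
    w = 1 / var                    # inverse-variance weight
    weights.append(w)
    weighted_logits.append(w * logit)

pooled_logit = sum(weighted_logits) / sum(weights)
pooled_prevalence = 1 / (1 + math.exp(-pooled_logit))
print(f"Pooled prevalence: {pooled_prevalence:.2%}")
```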
TAKEAWAY:
- The overall pooled prevalence of seborrheic dermatitis was 4.38%; it was 4.08% in studies conducted in clinical settings and 4.71% in studies conducted in the general population.
- The prevalence of seborrheic dermatitis was higher among adults (5.64%) than in children (3.7%) and neonates (0.23%).
- A significant variation was observed across countries, with South Africa having the highest prevalence at 8.82%, followed by the United States at 5.86% and Turkey at 3.74%, while India had the lowest prevalence at 2.62%.
IN PRACTICE:
The global prevalence in this meta-analysis was “higher than previous large-scale global estimates, with notable geographic and sociodemographic variability, highlighting the potential impact of environmental factors and cultural practices,” the authors wrote.
SOURCE:
The study was led by Meredith Tyree Polaskey, MS, Chicago Medical School, Rosalind Franklin University of Medicine and Science, North Chicago, Illinois, and was published online on July 3, 2024, in JAMA Dermatology.
LIMITATIONS:
Interpretation of the findings is limited by research gaps in Central Asia, much of Sub-Saharan Africa, Eastern Europe, Southeast Asia, Latin America (excluding Brazil), and the Caribbean, along with potential underreporting in regions with restricted healthcare access and significant heterogeneity across studies.
DISCLOSURES:
Funding information was not available. One author reported serving as an advisor, consultant, speaker, and/or investigator for multiple pharmaceutical companies, including AbbVie, Amgen, and Pfizer.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Most Potentially Hepatotoxic Meds Revealed: Real-World Data Analysis
TOPLINE:
An analysis of real-world evidence identified 17 medications, many not previously regarded as potentially hepatotoxic, that have high incidence rates of patient hospitalization for acute liver injury (ALI), offering insights on how to better determine which drugs carry the most significant risk and warrant liver monitoring.
METHODOLOGY:
- Without a systematic approach to classifying medications’ hepatotoxic risk, researchers have used case reports published on the National Institutes of Health’s LiverTox, which doesn’t account for the number of people exposed, to categorize drugs’ likelihood of causing ALI. The objective was to identify the most potentially hepatotoxic medications using real-world incidence rates of severe ALI.
- Researchers analyzed US Department of Veterans Affairs electronic health record data for almost 7.9 million individuals (mean age, 64.4 years; 92.5% men) without preexisting liver or biliary disease who were initiated in an outpatient setting on any one of 194 medications with four or more published reports of hepatotoxicity. Drugs delivered by injection or intravenously, prescribed for alcohol use disorder or liver disease treatment, or used as an anticoagulant were not included in the study.
- The primary outcome measured was hospitalization for severe ALI, defined by alanine aminotransferase levels > 120 U/L and total bilirubin levels > 2.0 mg/dL or the international normalized ratio ≥ 1.5 and total bilirubin levels > 2.0 mg/dL within the first 2 days of admission.
- Researchers organized the medications into groups on the basis of observed rates of severe ALI per 10,000 person-years and classified drugs with 10 or more hospitalizations (group 1) and 5-9.9 hospitalizations (group 2) as the most potentially hepatotoxic. The study period was October 2000 through September 2021. (A minimal sketch of the case definition and this grouping appears after this list.)
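To make the case definition and the rate-based grouping concrete, here is a minimal sketch. The laboratory thresholds come from the bullets above; the function names and example figures are hypothetical and are not taken from the study's code.

```python
# Illustrative only: thresholds from the text above; function names
# and example figures are hypothetical, not the study's code.

def meets_severe_ali_definition(alt_u_per_l: float,
                                total_bilirubin_mg_dl: float,
                                inr: float) -> bool:
    """Severe acute liver injury within the first 2 days of admission:
    ALT > 120 U/L with total bilirubin > 2.0 mg/dL, OR
    INR >= 1.5 with total bilirubin > 2.0 mg/dL."""
    hepatocellular = alt_u_per_l > 120 and total_bilirubin_mg_dl > 2.0
    coagulopathic = inr >= 1.5 and total_bilirubin_mg_dl > 2.0
    return hepatocellular or coagulopathic

def hepatotoxicity_group(hospitalizations: int, person_years: float) -> str:
    """Classify a drug by observed severe-ALI hospitalizations per 10,000 person-years."""
    rate = hospitalizations / person_years * 10_000
    if rate >= 10:
        return "group 1 (10 or more per 10,000 person-years)"
    if rate >= 5:
        return "group 2 (5-9.9 per 10,000 person-years)"
    return "lower observed rate"

# Example: 12 hospitalizations over 9,500 person-years is roughly 12.6 per 10,000
print(hepatotoxicity_group(12, 9_500))
print(meets_severe_ali_definition(alt_u_per_l=150, total_bilirubin_mg_dl=2.4, inr=1.1))
```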
TAKEAWAY:
- Among the study population, 1739 hospitalizations for severe ALI were identified. Incidence rates of severe ALI varied widely by medication, from 0 to 86.4 events per 10,000 person-years.
- Seventeen medications were classified as the most potentially hepatotoxic (groups 1 and 2). Seven of them (stavudine, erlotinib, lenalidomide or thalidomide, chlorpromazine, metronidazole, prochlorperazine, and isoniazid) had incidence rates of ≥ 10 events per 10,000 person-years. The other 10 medications (moxifloxacin, azathioprine, levofloxacin, clarithromycin, ketoconazole, fluconazole, captopril, amoxicillin-clavulanate, sulfamethoxazole-trimethoprim, and ciprofloxacin) showed incidence rates of 5-9.9 events per 10,000 person-years.
- Of the 17 most hepatotoxic medications, 11 (64%) were not classified as highly hepatotoxic in the published case reports, suggesting a discrepancy between real-world data and case report categorizations.
- Similarly, several medications, including some statins, identified as low-risk in this study were classified as among the most hepatotoxic in the published case reports.
IN PRACTICE:
“Categorization of hepatotoxicity based on the number of published case reports did not accurately reflect observed rates of severe ALI (acute liver injury),” the researchers wrote. “This study represents a systematic, reproducible approach to using real-world data to measure rates of severe ALI following medication initiation among patients without liver or biliary disease…Patients initiating a medication with a high rate of severe ALI might require closer monitoring of liver-related laboratory tests to detect evolving hepatic dysfunction earlier, which might improve prognosis.”
The study illustrates the potential to use electronic health record data to “revolutionize how we characterize drug-related toxic effects,” not just on the liver but on other organs, Grace Y. Zhang, MD, and Jessica B. Rubin, MD, MPH, of the University of California, San Francisco, wrote in an accompanying editorial. “If curated and disseminated effectively…such evidence will undoubtedly improve clinical decision-making and allow for more informed patient counseling regarding the true risks of starting or discontinuing medications.”
SOURCE:
The study, led by Jessie Torgersen, MD, MHS, MSCE, of the Division of Infectious Diseases, Department of Medicine, Perelman School of Medicine at the University of Pennsylvania, Philadelphia, was published online in JAMA Internal Medicine.
LIMITATIONS:
The researchers listed several limitations, including the possibility that reliance on laboratory tests for ascertainment of acute liver injuries could introduce surveillance bias. The study focused on a population predominantly consisting of men without preexisting liver or biliary disease, so the findings may not be generalizable to women or individuals with liver disease. Additionally, researchers did not perform a causality assessment of all outcomes, did not study medications with fewer than four published case reports, and did not evaluate the influence of dosage.
DISCLOSURES:
This study was partly funded by several grants from the National Institutes of Health. Some authors declared receiving grants and personal fees from some of the funding agencies and other sources outside of this work.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
What Are the Ethics of Sex and Romance for Older Adults in Nursing Homes?
This transcript has been edited for clarity.
I had a case a couple years ago in which I found myself completely at odds with the person complaining. A daughter came to me and said [paraphrasing], look, my dad is in a nursing home, and he’s just there for care that he needs, but he’s mentally competent. He’s enjoying watching television, playing games. He plays bridge and does many things. The nursing home is letting him have a romantic relationship with a woman who’s also in the nursing home. I think you, ethicist, should both intervene and try to stop that, and write more about the immorality of facilities like nursing homes or other long-term care settings permitting romance or sexual relations to take place.
I was reminded of that case because a report recently appeared that sexually transmitted diseases are on the rise among the elderly, both in nursing homes and in other settings. This obviously is linked up to another technological advance: the erectile dysfunction drugs.
I’m sure there are many men who, at one point in their lives, could not engage in sexual activity due to impotence. We have found a treatment for erectile dysfunction. Loads and loads of men are using it, and we forget that some of them are going to be older. The rate of impotence goes up directly with aging. If you’re in a nursing home, home care, or wherever you are, you may find yourself able to engage in sex in a way that your dad or your granddad may not have been.
We also know — and I found this out when I was tracking sales of erectile dysfunction drugs — that some of these older men are going to visit prostitutes. That’s another route, unsafe sex, for sexual diseases to be spreading into various older communities.
Morally, I think every individual who is competent and wishes to engage in a romantic or sexual relationship should be able to do so. If they’re within a marriage and they want to resume sexual activity because they get better or they can use these drugs, well, that’s great. If they’re single and they’re just living with others and they form an interesting romantic relationship, why shouldn’t they be allowed to engage in sex?
It is not only something that I didn’t agree with the complaining daughter about, but also I think some of these facilities should make more rooms for privacy and more opportunity for intimacy. It’s not like we should tell granddad that he’s living in a college dorm and try to make sure that his roommate doesn’t come in if he’s going to have his girlfriend over.
Are there ethical issues? Sure. Obviously, we should remember, if we have older patients, to talk to them about sexually transmitted diseases as part of a discussion of their sex life. We shouldn’t presume that they’re not doing something. We should presume that they might be, and then remind them about safe sex, particularly if they’re going to use third parties like prostitutes.
Competency becomes important. It’s one thing to have a mutually agreed upon romantic relationship. It’s another thing if somebody is taking advantage of someone who has Alzheimer’s or severe mental dysfunction and they’re not consenting.
How do we determine that and how do we manage that? I think people who are incompetent need to be protected from sexual advances unless they have a relative or someone who says they can engage if they enjoy it and it brings them pleasure. I wouldn’t want people who are vulnerable to be exploited or to have others act in a predatory way toward them.
As I said, we need to rethink the design of where older people are living, whether it’s assisted living, nursing home living, or wherever, just to give them the opportunity to have a full life, as any individual would have once they’re past the age of majority, no matter who they want to have romance with and what they want to do in terms of how far that intimacy goes.
Sadly, I didn’t agree with the daughter who came to me and asked me to stop it. I wouldn’t stop it nor would I publish against it. There are risks that we ought to be aware of, including exploiting vulnerable people if they can’t consent, and the danger of transmission of disease, as would be true in any group that might engage in high-risk behavior.
Another risk may be injury if someone is frail and can’t physically sustain sexual intimacy because they’re just too frail to do it. We also need to be sure to address the issue of sexuality with patients to make sure they know what’s going on, what risks there are, what rights they have, and so on.
At the end of the day, I’m not in the camp that says, “Just say no” when it comes to sex among the elderly.
Dr. Caplan is director, Division of Medical Ethics, New York University Langone Medical Center, New York. He has served as a director, officer, partner, employee, advisor, consultant, or trustee for Johnson & Johnson’s Panel for Compassionate Drug Use (unpaid position); he also serves as a contributing author and advisor for Medscape.
A version of this article first appeared on Medscape.com.