Is Cushing Syndrome More Common in the US Than We Think?
BOSTON — The prevalence of Cushing syndrome (CS) in the United States may be considerably higher than currently appreciated, new data from a single US institution suggest.
In contrast to estimates of 1 to 3 cases per million patient-years from population-based European studies, researchers at the Medical College of Wisconsin in Milwaukee estimated that the incidence of CS in Wisconsin is at least 7.2 cases per million patient-years. What’s more, contrary to all previous studies, they found that adrenal Cushing syndrome was more common than pituitary adrenocorticotropic hormone (ACTH)–secreting tumors (Cushing disease) and that fewer than half of individuals with adrenal Cushing syndrome had classic physical features of hypercortisolism, such as weight gain, round face, excessive hair growth, and stretch marks.
“Cases are absolutely being missed. ... Clinicians should realize that cortisol excess is not rare. It may not be common, but it needs to be considered in patients with any constellation of features that are seen in cortisol excess,” study investigator Ty B. Carroll, MD, associate professor of medicine, endocrinology and molecular medicine, and the endocrine fellowship program director at Medical College of Wisconsin in Milwaukee, told this news organization.
There are several contributing factors, he noted, “including the obesity and diabetes epidemics, which make some clinical features of cortisol excess more common and less notable. Providers get used to seeing patients with some features of cortisol excess and don’t think to screen. The consequence of this is more difficult-to-control diabetes and hypertension, more advanced metabolic bone disease, and likely more advanced cardiovascular disease, all resulting from extended exposure to cortisol excess,” he said.
Are Milder Cases the Ones Being Missed?
Asked to comment, session moderator Sharon L. Wardlaw, MD, professor of medicine at Columbia University College of Physicians and Surgeons, New York City, said, “When we talk about Cushing [syndrome], we usually think of pituitary ACTH as more [common], followed by adrenal adenomas, and then ectopic. But they’re seeing more adrenal adenoma ... we are probably diagnosing this a little more now.”
She also suggested that the Wisconsin group may have a lower threshold for diagnosing the milder cortisol elevation seen with adrenal Cushing syndrome. “If you screen for Cushing with a dexamethasone suppression test … [i]f you have autonomous secretion by the adrenal, you don’t suppress as much. ... When you measure 24-hour urinary cortisol, it may be normal. So you’re in this in-between [state]. ... Maybe in Wisconsin they’re diagnosing it more. Or, maybe it’s just being underdiagnosed in other places.”
She also pointed out that “you can’t diagnose it unless you think of it. I’m not so sure that with these mild cases it’s so much that it’s more common, but maybe it’s like thyroid nodules, where we didn’t know about it until everybody started getting all of these CT scans. We’re now seeing all these incidental thyroid nodules ... I don’t think we’re missing florid Cushing.”
However, Dr. Wardlaw said, it’s probably worthwhile to detect even milder hypercortisolism because it could still have long-term damaging effects, including osteoporosis, muscle weakness, glucose intolerance, and frailty. “You could do something about it and normalize it if you found it. I think that would be the reason to do it.”
Is Wisconsin Representative of Cushing Everywhere?
Dr. Carroll presented the findings at the annual meeting of the Endocrine Society. He began by noting that most of the previous CS incidence studies, with estimates of 1.2-3.2 cases per million per year, come from European data published from 1994 to 2019 and collected as far back as 1955. The method of acquisition of patients and the definitions of confirmed cases varied widely in those studies, which reported CS etiologies of ACTH-secreting neoplasms (pituitary or ectopic) in 75%-85% and adrenal-dependent cortisol excess in 15%-20%.
The current study included data from clinic records between May 1, 2017, and December 31, 2022, of Wisconsin residents newly diagnosed with and treated for CS. The CS diagnosis was established with standard guideline-supported biochemical testing and appropriate imaging. Patients with exogenous and non-neoplastic hypercortisolism and those who did not receive therapy for CS were excluded.
A total of 185 patients (73% female, 27% male) were identified from 27 of the total 72 counties in Wisconsin, representing a population of 4.5 million. On the basis of the total 5.9 million population of Wisconsin, the incidence of CS in the state works out to 7.2 cases per million population per year, Dr. Carroll said.
However, data from the Wisconsin Hospital Association show that the Medical College of Wisconsin’s Milwaukee facility treated only about half of the patients in the state who were discharged from the hospital with a diagnosis of CS during 2019-2023. “So ... that means an actual or approximate incidence of 14-15 cases per million per year rather than the 7.2 cases that we produce,” he said.
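The extrapolation quoted above is a simple capture-fraction adjustment; the short sketch below, written for illustration only, spells out the arithmetic. The 0.5 capture fraction is taken directly from the hospital-discharge figures cited, and the sketch is not a reconstruction of the investigators’ formal method.

```python
# Illustrative back-of-the-envelope arithmetic for the statewide extrapolation
# described above (a sketch, not the investigators' formal method).

observed_incidence = 7.2  # cases per million per year, from the registry analysis above
capture_fraction = 0.5    # the center treated roughly half of CS hospital discharges statewide

# If a center captures only a fraction of all cases, the statewide incidence is
# approximately the observed incidence divided by that fraction.
estimated_statewide_incidence = observed_incidence / capture_fraction
print(f"Estimated statewide incidence: {estimated_statewide_incidence:.1f} cases per million per year")
# Prints roughly 14.4, consistent with the quoted 14-15 cases per million per year.
```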
Etiologies were 60% adrenal (111 patients), 36.8% pituitary (68 patients), and 3.2% ectopic (6 patients). Those proportions were similar between genders.
On biochemical testing, values for late-night salivary cortisol, dexamethasone suppression, and urinary free cortisol were highest for the ectopic group (3.189 µg/dL, 42.5 µg/dL, and 1514.2 µg/24 h, respectively) and lowest for the adrenal group (0.236 µg/dL, 6.5 µg/dL, and 64.2 µg/24 h, respectively). All differences between groups were highly statistically significant, at P < .0001, Dr. Carroll noted.
Classic physical features of CS were present in 91% of people with pituitary CS and 100% of those with ectopic CS but in just 44% of individuals with adrenal CS. “We found that adrenal-dependent disease was the most common form of Cushing syndrome. It frequently presented without classic physical features, which may be due to the milder biochemical presentation,” he concluded.
Dr. Carroll reported consulting and investigator fees from Corcept Therapeutics. Dr. Wardlaw has no disclosures.
A version of this article appeared on Medscape.com.
New Blood Test for Large Vessel Stroke Could Be a ‘Game Changer’
When combined with clinical scores, a “game-changing” blood test can expedite the diagnosis and treatment of large vessel occlusion (LVO) stroke, potentially saving many lives, new data suggested.
Using cutoff levels of two blood biomarkers, glial fibrillary acidic protein (GFAP; 213 pg/mL) and D-dimer (600 ng/mL), together with the Field Assessment Stroke Triage for Emergency Destination (FAST-ED) scale (score > 2), investigators were able to detect LVOs with 81% sensitivity and 93% specificity in patients assessed less than 6 hours from symptom onset.
GFAP has previously been linked to brain bleeds and traumatic brain injury.
The test also ruled out all patients with brain bleeds, and investigators noted that it could also be used to detect intracerebral hemorrhage.
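To make the triage logic concrete, here is a minimal sketch of how the published cutoffs could be combined in a prehospital screen. The exact decision rule is not spelled out in this summary, so the conjunction below (GFAP under its cutoff to argue against hemorrhage, plus elevated D-dimer, plus a FAST-ED score above 2) is an assumption made for illustration, and the function name and parameters are hypothetical.

```python
def suspect_lvo(gfap_pg_ml: float, d_dimer_ng_ml: float, fast_ed_score: int) -> bool:
    """Illustrative prehospital triage sketch using the published cutoffs.

    This is not the validated TIME-trial algorithm; the way the three
    criteria are combined here (a simple conjunction) is an assumption.
    """
    low_gfap = gfap_pg_ml < 213          # high GFAP points toward hemorrhage rather than LVO
    high_d_dimer = d_dimer_ng_ml > 600   # D-dimer was elevated in LVO ischemic stroke
    high_fast_ed = fast_ed_score > 2     # clinical severity scale cutoff
    return low_gfap and high_d_dimer and high_fast_ed


# Hypothetical example: low GFAP, elevated D-dimer, FAST-ED score of 5
print(suspect_lvo(gfap_pg_ml=80.0, d_dimer_ng_ml=950.0, fast_ed_score=5))  # True
```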
“We have developed a game-changing, accessible tool that could help ensure that more people suffering from stroke are in the right place at the right time to receive critical, life-restoring care,” senior author Joshua Bernstock, MD, PhD, MPH, a clinical fellow in the department of neurosurgery at Brigham and Women’s Hospital in Boston, said in a press release.
The findings were published online on May 17 in Stroke: Vascular and Interventional Neurology.
Early Identification Crucial
Acute LVO stroke is one of the most treatable stroke types because of the availability of endovascular thrombectomy (EVT). However, EVT requires specialized equipment and teams that represent a small subset of accredited stroke centers and an even smaller subset of emergency medical facilities, so early identification of LVO is crucial, the investigators noted.
Dr. Bernstock and his team developed the TIME trial to assess the sensitivity and specificity of the blood biomarkers and scale cutoff values for identifying LVO vs non-LVO stroke.
As part of the observational prospective cohort trial, investigators included consecutive patients admitted to the Brandon Regional Hospital Emergency Department in Brandon, Florida, between May 2021 and August 2022 if they were referred for a suspected stroke and the time from symptom onset was under 18 hours.
Patients were excluded if they received thrombolytic therapy before blood was collected or if it was anticipated that blood collection would be difficult.
Investigators gathered information on patients’ clinical data, hematology results, time since last known well, and imaging findings to construct a clinical diagnosis (LVO ischemic stroke, non-LVO ischemic stroke, hemorrhagic stroke, or transient ischemic attack [TIA]).
In addition to the National Institutes of Health Stroke Scale, patients were assessed with the FAST-ED, the Rapid Arterial oCclusion Evaluation (RACE), the Cincinnati Stroke Triage Assessment Tool, and the Emergency Medical Stroke Assessment.
Of 323 patients in the final study sample, 29 (9%) had LVO ischemic stroke, and 48 (15%) had non-LVO ischemic stroke. Another 13 (4%) had hemorrhagic stroke, 12 (3.7%) had TIA, and the largest proportion had stroke mimics (n = 220; 68%), which included encephalopathy, hyperglycemia, hypertensive emergency, migraine, posterior reversible encephalopathy syndrome, and undetermined causes.
The Case for Biomarkers
When investigators looked at those with LVO ischemic stroke, they found the concentration of plasma D-dimer was significantly higher than that in patients with non-LVO suspected stroke (LVO suspected stroke, 1213 ng/mL; interquartile range [IQR], 733-1609 vs non-LVO suspected stroke, 617 ng/mL; IQR, 377-1345; P < .001).
In addition, GFAP was significantly increased in the plasma of patients with hemorrhagic stroke vs all other patients with suspected stroke (hemorrhagic stroke, 1464 pg/mL; IQR, 292-2580 vs nonhemorrhagic suspected stroke, 48 pg/mL; IQR, 12-98; P < .005).
Combinations of the blood biomarkers with the FAST-ED or RACE scales showed the best performance for LVO detection, with a specificity of 94% and a sensitivity of 71% for either scale combination.
When investigators analyzed data for just those patients identified within 6 hours of symptom onset, the combination of biomarkers plus FAST-ED resulted in a specificity of 93% and a sensitivity of 81%.
Given that clinical stroke scales in patients with hemorrhagic stroke frequently suggest LVO and that these patients are not candidates for EVT, a tool capable of ruling out hemorrhage and identifying only nonhemorrhagic ischemic LVO is essential, the investigators noted.
“In stroke care, time is brain,” Dr. Bernstock said. “The sooner a patient is put on the right care pathway, the better they are going to do. Whether that means ruling out bleeds or ruling in something that needs an intervention, being able to do this in a prehospital setting with the technology that we built is going to be truly transformative.”
The study was funded by an Innovate UK grant and private funding. Dr. Bernstock has positions and equity in Pockit Diagnostics Ltd. and Treovir Inc. and is on the boards of Centile Bio and NeuroX1. Other disclosures are noted in the original article.
A version of this article appeared on Medscape.com.
FROM STROKE: VASCULAR AND INTERVENTIONAL NEUROLOGY
Are Children Born Through ART at Higher Risk for Cancer?
The results of a large French study comparing the cancer risk in children conceived through assisted reproductive technology (ART) with that of naturally conceived children were published recently in JAMA Network Open. This study is one of the largest to date on this subject: It included 8,526,306 children born in France between 2010 and 2021, of whom 260,236 (3%) were conceived through ART, and followed them up to a median age of 6.7 years.
Motivations for the Study
ART (including artificial insemination, in vitro fertilization [IVF], or intracytoplasmic sperm injection [ICSI] with fresh or frozen embryo transfer) accounts for about 1 in 30 births in France. However, limited and heterogeneous data have suggested an increased risk for certain health disorders, including cancer, among children conceived through ART. Therefore, a large-scale evaluation of cancer risk in these children is important.
No Overall Increase
In all, 9256 children developed cancer, including 292 who were conceived through ART. Thus, the overall risk for cancer did not appear higher in children conceived through ART than in naturally conceived children.
Nevertheless, a slight increase in the risk for leukemia was observed in children conceived through IVF or ICSI: The investigators observed approximately one additional case for every 5000 newborns conceived through IVF or ICSI who reached age 10 years. Epidemiological monitoring should be continued to better evaluate long-term risks and to see whether the increased risk for leukemia is confirmed. If it is, it will then be useful to investigate the mechanisms, related either to ART techniques or to the parents’ fertility disorders, that could lead to an increased risk for leukemia.
This story was translated from Univadis France, which is part of the Medscape Professional Network, using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.
About 20% of Breast Cancer Survivors Gain Excess Weight
BOSTON — Nearly one in five breast cancer survivors will gain more than 10% of their body weight in the 6 years following their diagnosis, according to new research presented at ENDO 2024, the annual meeting of the Endocrine Society.
Younger age and lower weight at diagnosis were the strongest predictors of this excessive weight gain over time.
“Weight gain is a common concern after breast cancer diagnosis and treatment,” said Maria Daniela Hurtado Andrade, MD, PhD, of the Mayo Clinic in Jacksonville, Florida, who led the research. “This weight gain in breast cancer survivors increases breast cancer recurrence and mortality, increases cardiovascular disease and mortality, and also increases all-cause mortality.”
Previous studies have found an association between breast cancer survivorship and weight gain, but the reported incidences of weight gain — and the amounts gained — have been highly variable, she added.
In the study, researchers used the Mayo Clinic Breast Cancer Registry to identify 4575 breast cancer survivors and tracked their weight over the course of 6 years following cancer diagnosis. These patients were age-matched to women in the general population selected from the Rochester Epidemiology Project, which contains the medical records of residents of 27 counties in Minnesota and Wisconsin. All controls had no history of cancer or bariatric surgery.
Nearly all patients and controls were White (97%); at breast cancer diagnosis, patients were on average 58 years of age and weighed 76 kg (165.5 lb). Controls had similar ages and baseline weights.
At 6 years following breast cancer diagnosis, average weight gain was modest: Breast cancer survivors gained 1.6% of their body weight, compared with 0.7% in controls (P = .004).
However, 18% of breast cancer survivors had gained at least 10% of their body weight over that time. By comparison, 8% of controls experienced this excessive weight gain during that same time frame (P < .0001). The same trend was observed for 15% and 20% weight gain.
After adjustment for confounding factors, younger age at breast cancer diagnosis and lower baseline weight were the strongest predictors of more than 10% weight gain. BRCA2 mutation and use of systemic chemotherapy were also associated with excessive weight gain.
Several factors could be driving weight gain in these patients, said Zeynep Madak-Erdogan, PhD, at the University of Illinois Urbana-Champaign, who was not involved with the research. Her work focuses on how diet and nutrition affect hormone action in postmenopausal women and breast cancer survivors. Certain therapies can induce temporary or permanent menopause in patients, “and this early menopause might shift balance of estrogens and cause increased weight gain,” she said. Along the same lines, endocrine therapies can also affect estrogen production.
Stress and exhaustion from treatment — especially compounded by the two previous factors — are also likely culprits in weight gain, she continued.
“These findings highlight importance of lifestyle interventions,” added Dr. Madak-Erdogan. “In addition to changes in the diet (increased vegetable, fruit, [and] whole grain intake; reduction in saturated fats, alcohol, [and] sweetened beverage consumption), survivors should be consulted on importance of regular exercise.”
“These data clearly show we must consider weight changes in breast cancer survivors, and we must find ways of instituting strategies to mitigate these weight gains,” Dr. Hurtado Andrade said. “These women have a lot to think of when they have a breast cancer diagnosis, so we also must find ways of instituting these measures in a way that doesn’t increase the burden of their health.”
Dr. Hurtado Andrade has received research funding from the National Institutes of Health and from Phenomix Sciences. She is also a consultant for Novo Nordisk. These three organizations were not involved with this study. Dr. Madak-Erdogan had no disclosures.
A version of this article first appeared on Medscape.com.
FROM ENDO 2024
T-DXd Moves Toward First Line for HER2-Low Metastatic BC
HER2-low cancers express levels of human epidermal growth factor receptor 2 that are below standard thresholds for HER2-positive immunohistochemistry. In 2022, results from the DESTINY-Breast04 trial showed T-DXd (Enhertu, AstraZeneca) to be an effective second-line chemotherapy in patients with HER2-low metastatic breast cancer.
The highly awaited new findings, from the manufacturer-sponsored, open-label Phase 3 DESTINY-Breast06 trial, were presented at the annual meeting of the American Society of Clinical Oncology (ASCO) in Chicago, Illinois.
The findings not only definitively establish a role for T-DXd earlier in the treatment sequence for HER2-low cancers, they also suggest benefit in a group of patients designated for the purposes of this trial to be HER2-ultralow. These patients have cancers with only faintly detectable HER2 expression on currently used assays (J Clin Oncol 42, 2024 [suppl 17; abstr LBA 1000]).
In a separate set of findings also presented at ASCO, from the randomized, open-label phase 1b study DESTINY-Breast07, T-DXd showed efficacy in patients with previously untreated HER2-positive metastatic breast cancer both alone and in combination with the monoclonal antibody pertuzumab (Perjeta, Genentech).
DESTINY-Breast06 Methods and Results
The DESTINY-Breast06 findings were presented by lead investigator Giuseppe Curigliano, MD, PhD, of the University of Milan and the European Institute of Oncology. Dr. Curigliano and his colleagues randomized 866 patients with metastatic breast cancer: 436 to intravenous T-DXd and 430 to the investigator’s choice of capecitabine, nab-paclitaxel, or paclitaxel chemotherapy. The investigators chose capecitabine 60% of the time.
Most patients had cancers classed as HER2-low (immunohistochemistry [IHC] 1+ or 2+), while 153 had cancers classed by investigators as HER2-ultralow (IHC 0 with membrane staining or IHC under 1+). Patients enrolled in the study were those whose disease had progressed after endocrine therapy with or without targeted therapy. Patients’ median age was between 57 and 58 years, and all were chemotherapy-naive in the metastatic breast cancer setting.
The main outcome of the study was median progression-free survival in the HER2-low group. T-DXd improved progression-free survival, with a median of 13.2 months vs 8.1 months (hazard ratio, 0.62; 95% confidence interval, 0.51-0.74; P < .0001). In the intention-to-treat population, which included the HER2-ultralow patients, the benefit was the same (HR, 0.63; 95% CI, 0.53-0.75; P < .0001). This suggests that T-DXd is also effective in these patients, and it will be extremely important going forward to identify the lowest level of HER2 expression in metastatic breast cancers that can still benefit from T-DXd therapy, Dr. Curigliano said.
Overall survival could not be assessed in the study cohort because complete data were not yet available, Dr. Curigliano said. However, trends pointed to an advantage for T-DXd, and tumor response rates were markedly higher with T-DXd: 57% compared with 31% for standard chemotherapy in the full cohort.
Serious treatment-emergent adverse events were more common in the T-DXd–treated patients, with 11% of that arm developing drug-related interstitial lung disease and three patients dying of it. Five patients in the T-DXd arm died of adverse events deemed treatment-related, and none died from treatment-related adverse events in the standard chemotherapy arm. Altogether, 11 patients died in the T-DXd arm and 6 in the chemotherapy arm.
Clinical Implications of DESTINY-Breast06
The DESTINY-Breast06 data show that “we have to again change how we think about HER2 expression. Even very low levels of HER2 expression matter, and they can be leveraged to improve the treatment for our patients,” said Ian Krop, MD, PhD, of the Yale Cancer Center in New Haven, Connecticut, during the session where the results were presented.
But T-DXd may not be an appropriate first choice for all patients, especially given the safety concerns associated with T-DXd, he continued. With overall survival and quality-of-life data still lacking, clinicians will have to determine on a case-by-case basis who should get T-DXd in the first line.
“For patients who have symptomatic metastatic disease, who need a response to address those symptoms, those in whom you think chemotherapy may not work as well because they had, for example, a short recurrence interval after their adjuvant chemotherapy — using T-DXd in that first-line setting makes perfect sense to take advantage of the substantially higher response rate compared to chemo,” Dr. Krop said. “But for patients who have asymptomatic low burdens of disease, it seems very reasonable to consider using a well-tolerated chemotherapy like capecitabine in the first line, and then using T-DXd in the second line.”
In an interview, Erica Mayer, MD, of the Dana-Farber Cancer Institute in Boston, Massachusetts, said patient choice will also matter in determining whether T-DXd is a first-line option. The known toxicity of T-DXd was underscored by the latest findings, she noted, while capecitabine, one of the chemotherapy choices in the control arm of the study, “really reflects what the majority of breast cancer doctors tend to offer, both because of the efficacy of the drug, but also because it’s oral, it’s well tolerated, and you don’t lose your hair.”
DESTINY-Breast07 Results
The DESTINY-Breast07 findings, from a phase 1b open-label trial assessing safety and tolerability, were presented by Fabrice Andre, MD, PhD, of Université Paris-Saclay in Paris, France. Dr. Andre and his colleagues presented the first data comparing T-DXd monotherapy and T-DXd with pertuzumab — a monoclonal antibody targeting HER2 — as a first-line treatment in patients with HER2-overexpressing (IHC 3+) metastatic breast cancer (J Clin Oncol 42, 2024 [suppl 16; abstr 1009]).
Current first-line standard of care for these patients is pertuzumab, trastuzumab, and docetaxel, based on results from the 2015 CLEOPATRA trial. T-DXd is currently approved as a second-line treatment.
Dr. Andre and his colleagues randomized 75 patients to monotherapy with T-DXd and 50 to combined therapy, with a median follow-up of 2 years.
At 1 year, the combination of T-DXd and pertuzumab was associated with a progression-free survival rate of 89% (80% CI, 81.9-93.9), compared with 80% in patients treated with T-DXd alone (80% CI, 73.7-86.1). The objective tumor response rate at 12 weeks was 84% for the combined therapy, with 20% of patients having a complete response, compared with 76% and 8%, respectively, for monotherapy.
As in the DESTINY-Breast06 trial, rates of adverse events were high, with interstitial lung disease seen in 9% of patients in the monotherapy group and in 14% of the combined-therapy patients, although no treatment-related deaths occurred.
A randomized phase 3 trial, DESTINY-Breast09, will now compare the monotherapy and the combined therapy with the current standard of care.
T-DXd has seen a rapidly expanding role in treating breast and other solid tumors. The DESTINY-Breast06 findings will move its place up in the treatment algorithm for metastatic breast cancer, “allowing us to now offer T-DXd as the first chemotherapy choice for patients who are making that transition to chemotherapy over many of the traditional provider choices that we previously have offered,” Dr. Mayer said.
The results “support the use of not only this specific agent, but also the concept of antibody drug conjugates as a very effective way to treat malignancy,” she added.
Dr. Curigliano reported receiving speaker’s fees, research funding, and other support from AstraZeneca and Daiichi Sankyo, among other companies, as did most of his co-authors, of whom three were AstraZeneca employees. Dr. Andre disclosed receiving research funding, travel compensation, and/or advisory fees from AstraZeneca and other entities, as did several of his co-authors. Two of his co-authors were employed by AstraZeneca and Roche, manufacturers of the study drugs. Dr. Krop and Dr. Mayer disclosed relationships with AstraZeneca and others.
FROM ASCO
PTSD Rates Soar Among College Students
TOPLINE:
Posttraumatic stress disorder (PTSD) rates among college students more than doubled between 2017 and 2022, new data showed. Rates of acute stress disorder (ASD) also increased during that time.
METHODOLOGY:
- Researchers conducted five waves of a cross-sectional study from 2017 to 2022, involving 392,377 participants across 332 colleges and universities.
- The study utilized the Healthy Minds Study data, ensuring representativeness by applying sample weights based on institutional demographics.
- Outcome variables were diagnoses of PTSD and ASD, confirmed by healthcare practitioners, with statistical analysis assessing change in odds of estimated prevalence during 2017-2022.
TAKEAWAY:
- The prevalence of PTSD among US college students increased from 3.4% in 2017-2018 to 7.5% in 2021-2022.
- ASD diagnoses also rose from 0.2% in 2017-2018 to 0.7% in 2021-2022, with both increases remaining statistically significant after adjusting for demographic differences.
- Investigators noted that these findings underscore the need for targeted, trauma-informed intervention strategies in college settings.
IN PRACTICE:
“These trends highlight the escalating mental health challenges among college students, which is consistent with recent research reporting a surge in psychiatric diagnoses,” the authors wrote. “Factors contributing to this rise may include pandemic-related stressors (eg, loss of loved ones) and the effect of traumatic events (eg, campus shootings and racial trauma),” they added.
SOURCE:
The study was led by Yusen Zhai, PhD, University of Alabama at Birmingham. It was published online on May 30, 2024, in JAMA Network Open.
LIMITATIONS:
The study’s reliance on self-reported data and single questions for diagnosed PTSD and ASD may have limited the accuracy of the findings. The retrospective design and the absence of longitudinal follow-up may have restricted the ability to infer causality from the observed trends.
DISCLOSURES:
No disclosures were reported. No funding information was available.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication.
A version of this article appeared on Medscape.com.
Early Memory Problems Linked to Increased Tau
Reports from older adults and their partners of early memory issues are associated with higher levels of tau neurofibrillary tangles in the brain, new research suggests.
The findings show that in addition to beta-amyloid, tau is implicated in cognitive decline even in the absence of overt clinical symptoms.
“Understanding the earliest signs of Alzheimer’s disease is even more important now that new disease-modifying drugs are becoming available,” study author Rebecca E. Amariglio, PhD, clinical neuropsychologist at Brigham and Women’s Hospital and the Massachusetts General Hospital and assistant professor in neurology at Harvard Medical School, Boston, said in a news release. “Our study found early suspicions of memory problems by both participants and the people who knew them well were linked to higher levels of tau tangles in the brain.”
The study was published online in Neurology.
Subjective Cognitive Decline
Beta-amyloid plaque accumulations and tau neurofibrillary tangles both underlie the clinical continuum of Alzheimer’s disease (AD). Previous studies have investigated beta-amyloid burden and self- and partner-reported cognitive decline, but fewer have examined regional tau.
Subjective cognitive decline may be an early sign of AD, but self-awareness declines as individuals become increasingly symptomatic. So, a report from a partner about the participant’s level of cognitive functioning is often required in studies of mild cognitive impairment and dementia. The relevance of this model during the preclinical stage is less clear.
For the multicohort, cross-sectional study, investigators studied 675 cognitively unimpaired older adults (mean age, 72 years; 59% female), including persons with nonelevated beta-amyloid levels and those with elevated beta-amyloid levels, as determined by PET.
Participants brought a spouse, adult child, or other study partner with them to answer questions about the participant’s cognitive abilities and their ability to complete daily tasks. About 65% of participants lived with their study partners. Both the participant and the partner completed the Cognitive Function Index (CFI) to assess the participant’s cognitive decline, with higher scores indicating greater decline.
Covariates included age, sex, education, and cohort as well as objective cognitive performance.
The Value of Partner Reporting
Investigators found that higher tau levels were associated with greater self- and partner-reported cognitive decline (P < .001 for both).
Significant associations between self- and partner-reported CFI measures were driven by elevated beta-amyloid levels, with continuous beta-amyloid levels showing an independent effect on CFI in addition to tau.
“Our findings suggest that asking older people who have elevated Alzheimer’s disease biomarkers about subjective cognitive decline may be valuable for early detection,” Dr. Amariglio said.
Limitations include the fact that most participants were White and highly educated. Future studies should include participants from more diverse racial and ethnic groups and people with diverse levels of education, researchers noted.
“Although this study was cross-sectional, findings suggest that among older CU [cognitively unimpaired] individuals who [are] at risk for AD dementia, capturing self-report and study partner report of cognitive function may be valuable for understanding the relationship between early pathophysiologic progression and the emergence of functional impairment,” the authors concluded.
The study was funded in part by the National Institute on Aging, Eli Lilly, and the Alzheimer’s Association, among others. Dr. Amariglio receives research funding from the National Institute on Aging. Complete study funding and other authors’ disclosures are listed in the original paper.
A version of this article first appeared on Medscape.com.
Knowing My Limits
The records came in by fax. A patient who’d recently moved here and needed to connect with a local neurologist.
When I had time, I flipped through the records. He needed ongoing treatment for a rare neurological disease that I’d heard of, but wasn’t otherwise familiar with. It didn’t even exist in the textbooks or conferences when I was in residency. I’d never seen a case of it, just read about it here and there in journals.
I looked it up, reviewed current treatment options, monitoring, and other knowledge about it, then stared at the notes for a minute. Finally, after thinking it over, I attached a sticky note for my secretary: if the patient called, redirect him to one of the local subspecialty neurology centers.
I have nothing against this patient, but realistically he would be better served seeing someone with time to keep up on advancements in esoteric disorders, not a general neurologist like myself.
Isn’t that why we have subspecialty centers?
Some of it is also me. There was a time in my career when keeping up on newly discovered disorders and their treatments was, well, cool. But after 25 years in practice, that changes.
It’s important to be at least somewhat aware of new developments (such as in this case), since you may encounter them, and you need to know when it’s something you can handle and when to send it elsewhere.
Driving home that afternoon I thought, “I’m an old dog. I don’t want to learn new tricks.” Maybe that’s all it is. There are other neurologists my age and older who thrive on the challenge of learning about and treating new and rare disorders that were unknown when they started out. There’s nothing wrong with that.
But I’ve never pretended to be an academic or sub-sub-specialist. My patients depend on me to stay up to date on the large number of commonly seen neurological disorders, and I do my best to do that.
It ain’t easy being an old dog.
Dr. Block has a solo neurology practice in Scottsdale, Arizona.
New Era? ‘Double Selective’ Antibiotic Spares the Microbiome
A new antibiotic uses a never-before-seen mechanism to deliver a direct hit on tough-to-treat infections while leaving beneficial microbes alone. The strategy could lead to a new class of antibiotics that attack dangerous bacteria in a powerful new way, overcoming current drug resistance while sparing the gut microbiome.
“The biggest takeaway is the double-selective component,” said co-lead author Kristen A. Muñoz, PhD, who performed the research as a doctoral student at University of Illinois at Urbana-Champaign (UIUC). “We were able to develop a drug that not only targets problematic pathogens, but because it is selective for these pathogens only, we can spare the good bacteria and preserve the integrity of the microbiome.”
The drug goes after Gram-negative bacteria — pathogens responsible for debilitating and even fatal infections like gastroenteritis, urinary tract infections, pneumonia, sepsis, and cholera. The arsenal of antibiotics against them is old, with no new classes specifically targeting these bacteria coming on the market since 1968.
Many of these bugs have become resistant to one or more antibiotics, with deadly consequences. And antibiotics against them can also wipe out beneficial gut bacteria, allowing serious secondary infections to flare up.
In a study published in Nature, the drug lolamicin knocked out or reduced the growth of 130 strains of antibiotic-resistant Gram-negative bacteria in cell cultures. It also successfully treated drug-resistant bloodstream infections and pneumonia in mice while sparing their gut microbiome.
With their microbiomes intact, the mice then fought off secondary infection with Clostridioides difficile (a leading cause of opportunistic and sometimes fatal infections in US health care facilities), while mice treated with other compounds that damaged their microbiome succumbed.
How It Works
Like a well-built medieval castle, Gram-negative bacteria are encased in two protective walls, or membranes. Dr. Muñoz and her team at UIUC set out to breach this defense by finding compounds that hinder the “Lol system,” which ferries lipoproteins between them.
From one compound they constructed lolamicin, which can stop Gram-negative pathogens — with little effect on Gram-negative beneficial bacteria and no effect on Gram-positive bacteria.
“Gram-positive bacteria do not have an outer membrane, so they do not possess the Lol system,” Dr. Muñoz said. “When we compared the sequences of the Lol system in certain Gram-negative pathogens to Gram-negative commensal [beneficial] gut bacteria, we saw that the Lol systems were pretty different.”
Tossing a monkey wrench into the Lol system may be the study’s biggest contribution to future antibiotic development, said Kim Lewis, PhD, professor of biology and director of the Antimicrobial Discovery Center at Northeastern University in Boston, who has discovered several antibiotics now in preclinical research. One, darobactin, targets Gram-negative bugs without affecting the gut microbiome. Another, teixobactin, takes down Gram-positive bacteria without causing drug resistance.
“Lolamicin hits a novel target. I would say that’s the most significant study finding,” said Dr. Lewis, who was not involved in the study. “That is rare. If you look at antibiotics introduced since 1968, they have been modifications of existing antibiotics or, rarely, new chemically but hitting the same proven targets. This one hits something properly new, and [that’s] what I found perhaps the most original and interesting.”
Kirk E. Hevener, PharmD, PhD, associate professor of Pharmaceutical Sciences at the University of Tennessee Health Science Center, Memphis, Tennessee, agreed. (Dr. Hevener also was not involved in the study.) “Lolamicin works by targeting a unique Gram-negative transport system. No currently approved antibacterials work in this way, meaning it potentially represents the first of a new class of antibacterials with narrow-spectrum Gram-negative activity and low gastrointestinal disturbance,” said Dr. Hevener, whose research looks at new antimicrobial drug targets.
The UIUC researchers noted that lolamicin has one drawback: Bacteria frequently developed resistance to it. But in future work, it could be tweaked, combined with other antibiotics, or used as a template for finding other Lol system attackers, they said.
“There is still a good amount of work cut out for us in terms of assessing the clinical translatability of lolamicin, but we are hopeful for the future of this drug,” Dr. Muñoz said.
Addressing a Dire Need
Bringing such a drug to market — from discovery to Food and Drug Administration approval — could take more than a decade, said Dr. Hevener. And new agents, especially for Gram-negative bugs, are sorely needed.
Not only do these bacteria shield themselves with a double membrane but they also “have more complex resistance mechanisms including special pumps that can remove antibacterial drugs from the cell before they can be effective,” Dr. Hevener said.
As a result, drug-resistant Gram-negative bacteria are making treatment of severe infections such as sepsis and pneumonia in health care settings difficult.
Bloodstream infections with drug-resistant Klebsiella pneumoniae have a 40% mortality rate, Dr. Lewis said. And microbiome damage caused by antibiotics is also widespread and deadly, wiping out communities of helpful, protective gut bacteria. That contributes to over half of the C. difficile infections that affect 500,000 people and kill 30,000 a year in the United States.
“Our arsenal of antibacterials that can be used to treat Gram-negative infections is dangerously low,” Dr. Hevener said. “Research will always be needed to develop new antibacterials with novel mechanisms of activity that can bypass bacterial resistance mechanisms.”
A version of this article appeared on Medscape.com.
Vitamin D Test Inaccuracies Persist Despite Gains in Field: CDC
Some vitamin D tests may give misleading results despite progress made in recent years to improve the quality of these assays, according to the US Centers for Disease Control and Prevention (CDC).
Otoe Sugahara, manager of the CDC Vitamin D Standardization-Certification Program (VDSCP), presented an update of her group’s work at ENDO 2024, the Endocrine Society’s annual meeting in Boston.
“Though most vitamin D tests in our program have improved, there still remain some sample-specific inaccuracies. The CDC is working with program participants to address these situations,” Ms. Sugahara said in a statement released by the Endocrine Society.
For example, some assays measure other compounds besides 25-hydroxyvitamin D, which can falsely elevate results in some blood samples, Ms. Sugahara reported. As a result, some results may be misclassified, reported as sufficient when the sample should have indicated a vitamin D deficiency.
“While most vitamin D tests are effective, it is important for healthcare providers to be aware of the potential inconsistencies associated with vitamin D tests to avoid misclassification of the patients,” Ms. Sugahara and coauthors said in an abstract provided by the Endocrine Society.
Ms. Sugahara’s report provided a snapshot of the state of longstanding efforts to improve the quality of a widely performed service in US healthcare: testing vitamin D levels.
These include an international collaboration that gave rise in 2010 to a vitamin D standardization program, from which the CDC’s VDSCP certification emerged. Among the leaders of these efforts was Christopher Sempos, PhD, then with the Office of Dietary Supplements at the National Institutes of Health.
Many clinicians may not be aware of the concerns about the accuracy of vitamin D tests that led to the drive for standardization, Dr. Sempos, now retired, said in an interview. And, in his view, it’s something that busy practitioners should not have to consider.
“They have literally thousands of diseases they have to be able to recognize and diagnose,” Dr. Sempos said. “They should be able to count on the laboratory system to give them accurate and precise data.”
‘Nudging’ Toward Better Results
The CDC’s certification program gives labs and companies detailed information about the analytical accuracy and precision of their vitamin D tests.
This feedback has paid off with improved results, Andy Hoofnagle, MD, PhD, professor of laboratory medicine and pathology at the University of Washington in Seattle, told this news organization. It helps by “nudging manufacturers in the right direction,” he said.
“Some manufacturers reformulated, others recalibrated, which is a lot of effort on their part, so that when the patient get[s] a number, it actually means the right thing,” said Dr. Hoofnagle, who is also chair of the Accuracy-Based Programs Committee of the College of American Pathologists.
“There are still many immunoassays on the market that aren’t giving the correct results, unfortunately, but the standardization certification program has really pushed the field in the right direction,” he said.
US scientists use two main types of technologies to measure vitamin D in the blood, Ms. Sugahara said. One is mass spectrometry, which separately measures 25-hydroxyvitamin D2 and D3 and sums the values. The other type, immunoassay, measures both compounds at the same time and reports one result for total 25-hydroxyvitamin D.
At the ENDO 2024 meeting, Ms. Sugahara reported generally positive trends seen in the VDSCP. For example, the program looks at specific tests’ bias, or the deviation of test results from the true value, as determined with the CDC’s reference method for vitamin D.
Average calibration bias was less than 1% for all assays in the VDSCP in 2022, Ms. Sugahara said. The average calibration bias for immunoassays was 0.86%, and for assays using mass spectrometry, it was 0.55%, Ms. Sugahara reported.
These are improved results compared with 2019 data, in which mass spectrometry–based assays had a mean bias of 1.9% and immunoassays had a mean bias of 2.4%, the CDC told this news organization in an email exchange.
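To make those figures concrete, the sketch below shows, in simplified form, the two calculations described above: summing the D2 and D3 forms the way a mass spectrometry assay does, and computing an assay's mean percent bias against reference-method values. This is a minimal illustration with hypothetical numbers; the function names total_25ohd and mean_percent_bias are invented for this example, and the CDC's actual VDSCP statistical protocol may differ in its details.

```python
# Simplified illustration only; all sample values are hypothetical.
# 1) A mass spectrometry assay reports total 25(OH)D as the sum of the D2 and D3 forms.
# 2) Calibration bias is the average percent deviation of assay results from
#    reference-method ("true") values.

def total_25ohd(d2_ng_ml: float, d3_ng_ml: float) -> float:
    """Total 25-hydroxyvitamin D (ng/mL) as the sum of the D2 and D3 forms."""
    return d2_ng_ml + d3_ng_ml

def mean_percent_bias(assay_results, reference_values):
    """Mean percent deviation of paired assay results from reference values."""
    deviations = [
        (assay - ref) / ref * 100.0
        for assay, ref in zip(assay_results, reference_values)
    ]
    return sum(deviations) / len(deviations)

# Hypothetical paired measurements (ng/mL) for one assay vs the reference method.
assay = [28.5, 19.8, 41.2, 33.0]
reference = [28.0, 20.1, 40.5, 33.4]

print(f"Total 25(OH)D example: {total_25ohd(1.2, 27.3):.1f} ng/mL")
print(f"Mean calibration bias: {mean_percent_bias(assay, reference):+.2f}%")
```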
The CDC said the VDSCP supports laboratories and researchers from around the world, including ones based in the US, China, Australia, Japan, and Korea.
Call for Research
Vitamin D tests are widely administered despite questions about their benefit for people who do not appear likely to be deficient in the vitamin.
The Endocrine Society’s newly released practice guideline recommends against routine testing of blood vitamin D levels in the general population.
Laboratory testing has increased over the years owing to studies reporting associations between blood vitamin D [25(OH)D] levels and a variety of common disorders, including musculoskeletal, metabolic, cardiovascular, malignant, autoimmune, and infectious diseases, wrote Marie B. Demay, MD, of Harvard Medical School in Boston, and coauthors in the new guideline. It was published on June 3 in The Journal of Clinical Endocrinology & Metabolism.
“Although a causal link between serum 25(OH)D concentrations and many disorders has not been clearly established, these associations have led to widespread supplementation with vitamin D and increased laboratory testing for 25(OH)D in the general population,” they wrote.
It is uncertain whether “any putative benefits of screening would outweigh the increased burden and cost, and whether implementation of universal 25(OH)D screening would be feasible from a societal perspective,” Dr. Demay and coauthors added.
They noted that the influential US Preventive Services Task Force also has raised doubts about widespread use of vitamin D tests.
The USPSTF has a somewhat different take from the Endocrine Society. The task force in 2021 reiterated its view that there is not enough evidence to recommend for or against widespread vitamin D testing for adults. The task force gave this test an I grade, meaning there is insufficient evidence to weigh the risks and benefits. That’s the same grade the task force gave it in 2014.
The USPSTF uses a grade of D to recommend against use of a test or service.
In an interview with this news organization, John Wong, MD, vice chair of the USPSTF, reiterated his group’s call for more research into the potential benefits and harms of vitamin D screening.
One of the challenges in addressing this issue, Dr. Wong noted, has been the variability of test results. Efforts such as the CDC’s VDSCP to improve test quality may therefore eventually help build the kind of evidence base the task force needs to offer a more definitive judgment on the tests, he said.
Dr. Wong acknowledged it must be frustrating for clinicians and patients to hear that experts don’t have the evidence needed to make a broad call about whether routine vitamin D tests are beneficial.
“We really would like to have that evidence because we recognize that it’s an important health question to help everybody in this nation stay healthy and live longer,” Dr. Wong said.
A version of this article appeared on Medscape.com.