Hypothyroidism: No more waiting to eat or drink with liquid thyroxine?
ATLANTA -- Liquid formulations of levothyroxine offer the possibility of allowing patients with hypothyroidism to take their medication with meals or coffee and skip the currently recommended 30- to 60-minute waiting period before doing either, new data suggest.
Because food, coffee, and certain medications can interfere with intestinal absorption of levothyroxine (also known as LT4), current guidelines recommend that the drug be taken in a fasting state, typically 30-60 minutes before breakfast. However, compliance may be difficult for some patients.
Now, a potential solution may come from new evidence that liquid levothyroxine formulations that bypass the gastric dissolution phase of absorption may mitigate the interference with food and coffee.
Findings from two bioavailability studies showing no difference in comparisons of Thyquidity (levothyroxine sodium oral solution, Vertice Pharma) with or without waiting periods before consuming coffee or a high-fat meal were presented at the annual meeting of the Endocrine Society (ENDO 2022), by Vertice Pharma Medical Director Kris Washington, PharmD.
And just last month, similar data were published in Thyroid for another levothyroxine oral solution, Tirosint-SOL (IBSA). No difference in pharmacokinetic properties was found with that product between a shorter and a longer waiting period before consuming a high-fat meal.
Liquid thyroxine may be less affected by food/drink but is expensive
Both products have been approved by the U.S. Food and Drug Administration, but current labeling for both still calls for a 30- to 60-minute waiting period between taking the medication and eating or drinking. Thyquidity is an oral solution of 100 µg/mL levothyroxine sodium that has been shown to be bioequivalent to one of the most popular branded levothyroxine tablets, Synthroid (AbbVie), under fasting conditions. Tirosint-SOL is also an oral solution that comes in 15 different dosage ampules.
“It is important to note that while these findings are exciting and encouraging, we do want you to continue to follow the current FDA-approved label for Thyquidity, recommending that it be taken on an empty stomach 30-60 minutes prior to breakfast and that patients continue to follow all other label instructions,” Dr. Washington said during a press briefing at ENDO 2022.
When asked whether the new data would be submitted to the FDA for a possible amendment to this message, she replied: “We’re still discussing that. We’re exploring all options. ... This is fairly new data. ... It makes sense and certainly solves a lot of the challenges for people who can’t swallow or don’t choose to swallow, or the challenges of splitting or crushing with tablets.”
Asked to comment, Benjamin J. Gigliotti, MD, a clinical thyroidologist at the University of Rochester, New York, told this news organization: “Liquid levothyroxine has the potential to be a clinically useful formulation,” noting that these recent data corroborate prior findings from Europe and elsewhere that liquid levothyroxine is absorbed more rapidly and thus may be less impacted by food or beverages.
However, Dr. Gigliotti also pointed out, “I don’t think malabsorption is a major contributor to suboptimal treatment because if [patients] malabsorb the hormone, we typically just increase their dose a little bit or ask them to take it separately, and that works just fine for most people.”
And the higher cost of the liquid products is a major issue, he noted.
A quick search on GoodRx shows that the lowest price of Tirosint-SOL is $115.52 for a 1-month supply, while Thyquidity is $181.04/month. “In the few patients where I tried to obtain Tirosint-SOL, it was not covered by insurance, even with a prior authorization,” Dr. Gigliotti commented.
In contrast, generic levothyroxine tablets cost about $4/month, while a common brand-name levothyroxine tablet is $47.81/month.
“Until these liquid formulations are more widely covered by insurance for a reasonable copay, or come down in price compared to generic levothyroxine tablets, most of my patients have voiced that they’d rather deal with the inconveniences of a tablet compared to higher medication cost, especially with rising economic insecurity imposed by the COVID-19 pandemic and recent world events,” Dr. Gigliotti said.
Bioequivalence with shorter versus longer waits before coffee/breakfast
The Thyquidity coffee study was a single-center, open-label, randomized, crossover study in which 40 healthy adults, after a 10-hour overnight fast, were randomized to 600 µg Thyquidity with water under fasting conditions or to the same dose given 5 minutes prior to drinking an 8-ounce cup of American coffee without milk or sweeteners. After a 40-day washout period, the same participants received the other treatment.
Mean serum thyroxine (T4) concentrations over 48 hours were nearly identical, demonstrating comparable bioavailability. Pharmacokinetic parameters, including area under the curve (AUC) and Cmax, were also comparable for both groups. The geometric least squares mean ratios for baseline-adjusted LT4 were 96.0% for Cmax and 94% for AUC, and the corresponding 90% confidence intervals fell within the 80%-125% FDA acceptance range for absence of a food effect on bioavailability, said Dr. Washington when presenting the findings.
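For readers unfamiliar with how an “absence of food effect” conclusion is reached, the analysis is typically done on log-transformed pharmacokinetic parameters: the geometric least squares mean ratio (test vs. reference) is estimated, exponentiated back, and the claim holds when the 90% confidence interval of that ratio lies entirely within 80%-125%. The sketch below is a minimal Python illustration of that calculation for a paired crossover design; the AUC values are invented placeholders, and this is not the study’s actual analysis code.

```python
# Minimal illustration of the crossover bioequivalence check:
# 90% CI of the geometric mean ratio must lie within 80%-125%.
# All numbers are invented; this is not the study's analysis code.
import numpy as np
from scipy import stats

# Baseline-adjusted AUC for each subject under the two conditions.
auc_test      = np.array([310.2, 285.7, 402.1, 298.4, 355.0, 330.9])  # e.g., with coffee
auc_reference = np.array([322.8, 290.1, 415.6, 305.2, 348.7, 341.5])  # fasting

# Work on the log scale: the mean paired log-difference estimates
# log(geometric mean ratio) in a crossover design.
log_diff = np.log(auc_test) - np.log(auc_reference)
n = len(log_diff)
mean_diff = log_diff.mean()
se = log_diff.std(ddof=1) / np.sqrt(n)

# Two-sided 90% CI (equivalent to two one-sided tests at the 5% level).
t_crit = stats.t.ppf(0.95, df=n - 1)
gmr = np.exp(mean_diff)
ci_low = np.exp(mean_diff - t_crit * se)
ci_high = np.exp(mean_diff + t_crit * se)

no_food_effect = (ci_low >= 0.80) and (ci_high <= 1.25)
print(f"GMR = {gmr:.3f}, 90% CI = ({ci_low:.3f}, {ci_high:.3f}), "
      f"no food effect: {no_food_effect}")
```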
There was one adverse event, a decrease in blood glucose level, which was deemed to be mild and unrelated to study treatment. No deaths, serious adverse events, or discontinuations due to adverse events were reported. There were no significant changes in vital signs or on ECG.
In the second Thyquidity study of 38 healthy adults, after a 10-hour fast, the same doses were given 10 or 30 minutes prior to the consumption of a 950-calorie standardized high-fat breakfast.
Again, over 48 hours, mean serum T4 levels were comparable between the two groups. The geometric least squares mean ratios for AUC and Cmax for baseline-adjusted LT4 were 88.7% and 85.1%, respectively, and the corresponding 90% confidence intervals again fell within the FDA’s 80%-125% acceptance range, demonstrating the lack of a food effect on bioavailability, Dr. Washington noted.
Four adverse events were reported in three participants, with three deemed to be possibly related to the medication. All were isolated lab abnormalities without clinical symptoms and deemed to be mild. Three were normal on repeat testing.
There were no deaths or serious adverse events or study discontinuations for adverse events and no significant findings for vital signs or on ECG.
Similar findings for Tirosint-SOL but longer-term studies needed
The recently published Tirosint-SOL study included 36 healthy volunteers randomized to single 600-µg doses of the LT4 oral solution after a 10-hour fast, either 15 or 30 minutes before eating a standardized high-fat, high-calorie meal. Mean serum total thyroxine concentration profiles were similar for both the 15- and 30-minute waits, with similar AUCs.
Geometric mean ratios for AUCs at 48 and 72 hours were 90% and 92%, respectively, and the 90% confidence intervals fell within the 80%-125% FDA boundaries, suggesting similar exposures whether taken 15 or 30 minutes before a meal.
Senior author Francesco S. Celi, MD, chair of the division of endocrinology, diabetes, and metabolism at Virginia Commonwealth University, Richmond, told this news organization: “There is an interest in providing more opportunities for patients and improving adherence to the medication. ... Whatever makes life a bit easier for patients and results in a more predictable response to treatment means down the road there will be fewer visits to the doctor to make adjustments.”
However, he said that in addition to the cost and reimbursement issue, all of these studies have been short term and not conducted in real-life settings.
“Another question is: What happens if the patient goes on low-dose LT4? The studies were conducted on much higher pharmacologic doses. But at least from a safety standpoint, there’s no specific concern.”
Dr. Washington is an employee of Vertice Pharma. Dr. Celi has received unrestricted research grants and worked as a consultant for IBSA. Dr. Gigliotti has reported no relevant financial relationships.
A version of this article first appeared on Medscape.com.
AT ENDO 2022
Nonhormonal drug for menopause symptoms passes phase 3 test
A phase 3 trial has associated the neurokinin-3 (NK3)–receptor antagonist fezolinetant, an oral therapy taken once daily, with substantial control over the symptoms of menopause, according to results of the randomized SKYLIGHT 2 trial.
The nonhormonal therapy has the potential to address an important unmet need, Genevieve Neal-Perry, MD, PhD, said at the annual meeting of the Endocrine Society.
The health risks of hormone therapy (HT) have “caused quite a few women to consider whether hormone replacement is right for them, and, in addition, there are other individuals who have hormone-responsive cancers or other disorders that might prohibit them [from using HT],” Dr. Neal-Perry said.
The NK3 receptor stimulates the thermoregulatory center in the hypothalamus; blocking it inhibits vasodilation and other downstream effects, explained Dr. Neal-Perry. She credited relatively recent advances in understanding the mechanisms of menopausal symptoms for identifying this and other potentially targetable mediators.
SKYLIGHT 2 trial: Two phases
In the double-blind, multinational, phase 3 SKYLIGHT 2 trial, 484 otherwise healthy symptomatic menopausal women were randomized to 30 mg of fezolinetant, 45 mg of fezolinetant, or placebo. The 120 participating centers were in North America and Europe.
In the first phase, safety and efficacy were evaluated over 12 weeks. In a second extension phase, placebo patients were rerandomized to one of the fezolinetant study doses. Those on active therapy remained in their assigned groups. All patients were then followed for an additional 40 weeks.
The coprimary endpoints were frequency and severity of moderate to severe vasomotor symptoms as reported by patients using an electronic diary. There were several secondary endpoints, including patient-reported outcomes regarding sleep quality.
As expected from other controlled trials, placebo patients achieved about a 40% reduction in moderate to severe vasomotor symptom frequency over the first 12 weeks. Relative to placebo, symptom frequency declined more quickly and steeply on fezolinetant. By week 12, both achieved reductions of about 60%. Statistical P values for the differences in the three arms were not provided, but Dr. Neal-Perry reported they were significant.
Vasomotor severity, like frequency, is reduced
The change in vasomotor severity, which subjects in the trial rated as better or worse, was also significant. The differences between the severity curves were smaller, but they separated in favor of the two active treatment arms by about 2 weeks, and the curves continued to show an advantage for fezolinetant over both the first 12 weeks and the remaining 40 weeks.
Overall, the decline in vasomotor symptom frequency remained on a persistent downward slope on both doses of fezolinetant for the full 52 weeks of the study, so that the reduction at 52 weeks was on the order of 25% greater than that seen at 12 weeks.
At 52 weeks, “you can see that individuals on placebo who were crossed over to an active treatment had a significant reduction in their hot flashes and look very much like those who were randomized to fezolinetant at the beginning of the study,” said Dr. Neal-Perry, who is chair of the department of obstetrics and gynecology at the University of North Carolina at Chapel Hill.
Other outcomes also favored fezolinetant over placebo. For example, a reduction in sleep disturbance observed at 12 weeks was sustained over the full 52 weeks of the study. The reduction in sleep symptoms appeared to be slightly greater on the higher dose, but the benefit at 52 weeks among patients after the crossover was similar on either active arm.
No serious side effects identified
There were no serious drug-related treatment-emergent adverse events in any treatment group. One patient in the placebo arm (< 1%), two patients in the 30-mg fezolinetant arm (1.2%), and five patients in the 45-mg arm (3%) discontinued therapy for an adverse event considered to be treatment related.
“The most common side effect associated with fezolinetant was headache. There were no other side effects that led patients to pull out of the study,” Dr. Neal-Perry reported at the meeting, which was held in Atlanta and virtually.
According to Dr. Neal-Perry, vasomotor symptoms related to menopause, which occur in almost all women, are moderate to severe in an estimated 35%-45%. Some groups, such as those with an elevated body mass index and African Americans, appear to be at even greater risk. Study enrollment was specifically designed to include these high-risk groups, but the subgroup efficacy data have not yet been analyzed.
Other drugs with a similar mechanism of action have not been brought forward because of concern about elevated liver enzymes, but Dr. Neal-Perry said that this does not appear to be an issue for fezolinetant, which was designed with greater specificity for the NK3 target than previous treatments.
If fezolinetant is approved, Dr. Neal-Perry expects this agent to fulfill an important unmet need because of the limitations of other nonhormonal solutions for control of menopause symptoms.
HT alternatives limited
For control of many menopause symptoms, particularly hot flashes, HT is the most efficacious option, but Richard J. Santen, MD, emeritus professor and an endocrinologist at the University of Virginia, Charlottesville, agreed there is a need for alternatives.
In addition to those who have contraindications for HT, Dr. Santen said in an interview that this option is not acceptable to others “for a variety of reasons.” The problem is that the alternatives are limited.
“The SSRI agents and gabapentin are alternative nonhormonal agents, but they have side effects and are not as effective,” he said. Hot flashes “can be a major disruptor of quality of life,” so he is intrigued with the positive results achieved with fezolinetant.
“A new drug such as reported at the Endocrine Society meeting would be an important new addition to the armamentarium,” he said.
Dr. Neal-Perry reports no conflicts of interest.
FROM ENDO 2022
Synthetic opioid use up almost 800% nationwide
The results of a national urine drug test (UDT) study come as the United States is reporting a record-high number of drug overdose deaths – more than 80% of which involved fentanyl or other synthetic opioids – prompting a push for better surveillance models.
Researchers found that UDTs can be used to accurately identify which drugs are circulating in a community, revealing in just a matter of days critically important drug use trends that current surveillance methods take a month or longer to report.
The faster turnaround could potentially allow clinicians and public health officials to be more proactive with targeted overdose prevention and harm-reduction strategies such as distribution of naloxone and fentanyl test strips.
“We’re talking about trying to come up with an early-warning system,” study author Steven Passik, PhD, vice president for scientific affairs for Millennium Health, San Diego, Calif., told this news organization. “We’re trying to find out if we can let people in the harm reduction and treatment space know about what might be coming weeks or a month or more in advance so that some interventions could be marshaled.”
The study was published online in JAMA Network Open.
Call for better surveillance
More than 100,000 people in the United States died of an unintended drug overdose in 2021, a record high and a 15% increase over 2020 figures, which also set a record.
Part of the federal government’s plan to address the crisis includes strengthening epidemiologic efforts by better collection and mining of public health surveillance data.
Sources currently used to detect drug use trends include mortality data, poison control centers, emergency departments, electronic health records, and crime laboratories. But analysis of these sources can take weeks or more.
“One of the real challenges in addressing and reducing overdose deaths has been the relative lack of accessible real-time data that can support agile responses to deployment of resources in a specific geographic region,” study coauthor Rebecca Jackson, MD, professor and associate dean for clinical and translational research at Ohio State University in Columbus, said in an interview.
Ohio State researchers partnered with scientists at Millennium Health, one of the largest urine test labs in the United States, on a cross-sectional study to find out if UDTs could be an accurate and speedier tool for drug surveillance.
They analyzed 500,000 unique urine samples from patients in substance use disorder (SUD) treatment facilities in all 50 states from 2013 to 2020, comparing positivity rates for cocaine, heroin, methamphetamine, synthetic opioids, and other opioids in the samples with overdose mortality data for the same drugs at the national, state, and county levels from the National Vital Statistics System.
On a national level, synthetic opioids and methamphetamine were highly correlated with overdose mortality data (Spearman’s rho = .96 for both). When synthetic opioids were coinvolved, methamphetamine (rho = .98), heroin (rho = .78), cocaine (rho = .94), and other opioids (rho = .83) were also highly correlated with overdose mortality data.
Similar correlations were found when examining state-level data from 24 states and at the county level upon analysis of 19 counties in Ohio.
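For readers curious how such a comparison works in practice, the short Python sketch below computes a Spearman rank correlation between yearly UDT positivity rates and overdose mortality rates. The series are invented placeholders (only the 2013 and 2020 synthetic opioid positivity endpoints echo the figures reported below), not the study dataset.

```python
# Hypothetical illustration of the study's correlation approach: Spearman's rho
# between annual UDT positivity rates and overdose mortality rates.
# All values below are invented placeholders, not the study data.
from scipy.stats import spearmanr

years = list(range(2013, 2021))
udt_synthetic_opioid_pct = [2.1, 3.0, 4.6, 7.3, 10.2, 13.5, 16.0, 19.1]  # % of samples positive
deaths_per_100k          = [3.1, 3.9, 6.2, 9.0, 9.8, 10.7, 12.6, 17.8]   # overdose mortality rate

rho, p_value = spearmanr(udt_synthetic_opioid_pct, deaths_per_100k)
print(f"Spearman's rho = {rho:.2f} (P = {p_value:.3g})")
```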
A changing landscape
Researchers said the strong correlation between overdose deaths and UDT results for synthetic opioids and methamphetamine is likely explained by the drugs’ availability and lethality.
“The most important thing that we found was just the strength of the correlation, which goes right to the heart of why we considered correlation to be so critical,” lead author Penn Whitley, senior director of bioinformatics for Millennium Health, told this news organization. “We needed to demonstrate that there was a strong correlation of just the UDT positivity rates with mortality – in this case, fatal drug overdose rates – as a steppingstone to build out tools that could utilize UDT as a real-time data source.”
While the main goal of the study was to establish correlation between UDT results and national mortality data, the study also offers a view of a changing landscape in the opioid epidemic.
Overall, UDT positivity for total synthetic opioids increased from 2.1% in 2013 to 19.1% in 2020 (a 792.5% increase). Positivity rates for all included drug categories increased when synthetic opioids were present.
However, in the absence of synthetic opioids, UDT positivity decreased for almost all drug categories from 2013 to 2020 (from 7.7% to 4.7% for cocaine; 3.9% to 1.6% for heroin; 20.5% to 6.9% for other opioids).
Only methamphetamine positivity increased with or without involvement of synthetic opioids. With synthetic opioids, meth positivity rose from 0.1% in 2013 to 7.9% in 2020. Without them, meth positivity rates still rose, from 2.1% in 2013 to 13.1% in 2020.
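As a point of arithmetic, the relative increases quoted in this section are computed as (new rate − old rate) / old rate × 100. The tiny sketch below reproduces them from the rounded percentages above; note that the published 792.5% figure for synthetic opioids presumably reflects unrounded underlying rates, since the rounded 2.1% to 19.1% works out to roughly 810%.

```python
# Relative (percent) increase in UDT positivity, using the rounded rates quoted above.
def percent_increase(old: float, new: float) -> float:
    """Percent change from old to new: (new - old) / old * 100."""
    return (new - old) / old * 100

print(f"Synthetic opioids:            {percent_increase(2.1, 19.1):.1f}%")  # ~809.5% from rounded rates
print(f"Meth, with synthetic opioids: {percent_increase(0.1, 7.9):.1f}%")
print(f"Meth, without synthetics:     {percent_increase(2.1, 13.1):.1f}%")
```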
The findings track with an earlier study showing methamphetamine-involved overdose deaths rose sharply between 2011 and 2018.
“The data from this manuscript support that the opioid epidemic is transitioning from an opioid epidemic to a polysubstance epidemic where illicit synthetic opioids, largely fentanyl, in combination with other substances are now responsible for upwards of 80% of OD deaths,” Dr. Jackson said.
In an accompanying editorial, Jeffrey Brent, MD, PhD, clinical professor in internal medicine at the University of Colorado at Denver, Aurora, and Stephanie T. Weiss, MD, PhD, staff clinician in the Translational Addiction Medicine Branch at the National Institute on Drug Abuse, Baltimore, note that as new agents emerge, different harm-reduction strategies will be needed, and that having a real-time tool to identify these trends will be key to preventing deaths.
“Surveillance systems are an integral component of reducing morbidity and mortality associated with illicit drug use. On local, regional, and national levels, information of this type is needed to most efficiently allocate limited resources to maximize benefit and save lives,” Dr. Brent and Dr. Weiss write.
The study was funded by Millennium Health and the National Center for Advancing Translational Sciences. Full disclosures are included in the original articles, but no sources reported conflicts related to the study.
A version of this article first appeared on Medscape.com.
The results of a national urine drug test (UDT) study come as the United States is reporting a record-high number of drug overdose deaths – more than 80% of which involved fentanyl or other synthetic opioids and prompting a push for better surveillance models.
Researchers found that UDTs can be used to accurately identify which drugs are circulating in a community, revealing in just a matter of days critically important drug use trends that current surveillance methods take a month or longer to report.
The faster turnaround could potentially allow clinicians and public health officials to be more proactive with targeted overdose prevention and harm-reduction strategies such as distribution of naloxone and fentanyl test strips.
“We’re talking about trying to come up with an early-warning system,” study author Steven Passik, PhD, vice president for scientific affairs for Millennium Health, San Diego, Calif., told this news organization. “We’re trying to find out if we can let people in the harm reduction and treatment space know about what might be coming weeks or a month or more in advance so that some interventions could be marshaled.”
The study was published online in JAMA Network Open.
Call for better surveillance
More than 100,000 people in the United States died of an unintended drug overdose in 2021, a record high and a 15% increase over 2020 figures, which also set a record.
Part of the federal government’s plan to address the crisis includes strengthening epidemiologic efforts by better collection and mining of public health surveillance data.
Sources currently used to detect drug use trends include mortality data, poison control centers, emergency departments, electronic health records, and crime laboratories. But analysis of these sources can take weeks or more.
“One of the real challenges in addressing and reducing overdose deaths has been the relative lack of accessible real-time data that can support agile responses to deployment of resources in a specific geographic region,” study coauthor Rebecca Jackson, MD, professor and associate dean for clinical and translational research at Ohio State University in Columbus, said in an interview.
Ohio State researchers partnered with scientists at Millennium Health, one of the largest urine test labs in the United States, on a cross-sectional study to find out if UDTs could be an accurate and speedier tool for drug surveillance.
They analyzed 500,000 unique urine samples from patients in substance use disorder (SUD) treatment facilities in all 50 states from 2013 to 2020, comparing levels of cocaine, heroin, methamphetamine, synthetic opioids, and other opioids found in the samples to levels of the same drugs from overdose mortality data at the national, state, and county level from the National Vital Statistics System.
On a national level, synthetic opioids and methamphetamine were highly correlated with overdose mortality data (Spearman’s rho = .96 for both). When synthetic opioids were coinvolved, methamphetamine (rho = .98), heroin (rho = .78), cocaine (rho = .94), and other opioids (rho = .83) were also highly correlated with overdose mortality data.
Similar correlations were found when examining state-level data from 24 states and at the county level upon analysis of 19 counties in Ohio.
A changing landscape
Researchers said the strong correlation between overdose deaths and UDT results for synthetic opioids and methamphetamine are likely explained by the drugs’ availability and lethality.
“The most important thing that we found was just the strength of the correlation, which goes right to the heart of why we considered correlation to be so critical,” lead author Penn Whitley, senior director of bioinformatics for Millennium Health, told this news organization. “We needed to demonstrate that there was a strong correlation of just the UDT positivity rates with mortality – in this case, fatal drug overdose rates – as a steppingstone to build out tools that could utilize UDT as a real-time data source.”
While the main goal of the study was to establish correlation between UDT results and national mortality data, the study also offers a view of a changing landscape in the opioid epidemic.
Overall, UDT positivity for total synthetic opioids increased from 2.1% in 2013 to 19.1% in 2020 (a 792.5% increase). Positivity rates for all included drug categories increased when synthetic opioids were present.
However, in the absence of synthetic opioids, UDT positivity decreased for almost all drug categories from 2013 to 2020 (from 7.7% to 4.7% for cocaine; 3.9% to 1.6% for heroin; 20.5% to 6.9% for other opioids).
Only methamphetamine positivity increased with or without involvement of synthetic opioids. With synthetic opioids, meth positivity rose from 0.1% in 2013 to 7.9% in 2020. Without them, meth positivity rates still rose, from 2.1% in 2013 to 13.1% in 2020.
The findings track with an earlier study showing methamphetamine-involved overdose deaths rose sharply between 2011 and 2018.
“The data from this manuscript support that the opioid epidemic is transitioning from an opioid epidemic to a polysubstance epidemic where illicit synthetic opioids, largely fentanyl, in combination with other substances are now responsible for upwards of 80% of OD deaths,” Dr. Jackson said.
In an accompanying editorial Jeffrey Brent, MD, PhD, clinical professor in internal medicine at the University of Colorado at Denver, Aurora, and Stephanie T. Weiss, MD, PhD, staff clinician in the Translational Addiction Medicine Branch at the National Institute on Drug Abuse, Baltimore, note that as new agents emerge, different harm-reduction strategies will be needed, adding that having a real-time tool to identify the trends will be key to preventing deaths.
“Surveillance systems are an integral component of reducing morbidity and mortality associated with illicit drug use. On local, regional, and national levels, information of this type is needed to most efficiently allocate limited resources to maximize benefit and save lives,” Dr. Brent and Dr. Weiss write.
The study was funded by Millennium Health and the National Center for Advancing Translational Sciences. Full disclosures are included in the original articles, but no sources reported conflicts related to the study.
A version of this article first appeared on Medscape.com.
The results of a national urine drug test (UDT) study come as the United States is reporting a record-high number of drug overdose deaths – more than 80% of which involved fentanyl or other synthetic opioids and prompting a push for better surveillance models.
Researchers found that UDTs can be used to accurately identify which drugs are circulating in a community, revealing in just a matter of days critically important drug use trends that current surveillance methods take a month or longer to report.
The faster turnaround could potentially allow clinicians and public health officials to be more proactive with targeted overdose prevention and harm-reduction strategies such as distribution of naloxone and fentanyl test strips.
“We’re talking about trying to come up with an early-warning system,” study author Steven Passik, PhD, vice president for scientific affairs for Millennium Health, San Diego, Calif., told this news organization. “We’re trying to find out if we can let people in the harm reduction and treatment space know about what might be coming weeks or a month or more in advance so that some interventions could be marshaled.”
The study was published online in JAMA Network Open.
Call for better surveillance
More than 100,000 people in the United States died of an unintended drug overdose in 2021, a record high and a 15% increase over 2020 figures, which also set a record.
Part of the federal government’s plan to address the crisis includes strengthening epidemiologic efforts by better collection and mining of public health surveillance data.
Sources currently used to detect drug use trends include mortality data, poison control centers, emergency departments, electronic health records, and crime laboratories. But analysis of these sources can take weeks or more.
“One of the real challenges in addressing and reducing overdose deaths has been the relative lack of accessible real-time data that can support agile responses to deployment of resources in a specific geographic region,” study coauthor Rebecca Jackson, MD, professor and associate dean for clinical and translational research at Ohio State University in Columbus, said in an interview.
Ohio State researchers partnered with scientists at Millennium Health, one of the largest urine test labs in the United States, on a cross-sectional study to find out if UDTs could be an accurate and speedier tool for drug surveillance.
They analyzed 500,000 unique urine samples from patients in substance use disorder (SUD) treatment facilities in all 50 states from 2013 to 2020, comparing positivity rates for cocaine, heroin, methamphetamine, synthetic opioids, and other opioids in the samples with overdose mortality rates for the same drugs at the national, state, and county level from the National Vital Statistics System.
On a national level, synthetic opioids and methamphetamine were highly correlated with overdose mortality data (Spearman’s rho = .96 for both). When synthetic opioids were coinvolved, methamphetamine (rho = .98), heroin (rho = .78), cocaine (rho = .94), and other opioids (rho = .83) were also highly correlated with overdose mortality data.
Similar correlations were found when examining state-level data from 24 states and at the county level upon analysis of 19 counties in Ohio.
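To make the correlation analysis concrete, here is a minimal sketch of a Spearman rank correlation between yearly UDT positivity rates and overdose mortality rates, using SciPy. This is not the study's code, and the values below are made-up placeholders rather than study data.

```python
# Minimal sketch (not the study's code): Spearman rank correlation between yearly
# UDT positivity rates and overdose mortality rates. The arrays hold made-up
# illustrative values, not study data.
from scipy.stats import spearmanr

udt_positivity_pct = [2.1, 3.0, 5.2, 8.1, 11.4, 14.0, 16.8, 19.1]      # hypothetical % of samples positive for synthetic opioids, 2013-2020
overdose_deaths_per_100k = [3.1, 3.8, 6.2, 9.0, 9.9, 9.7, 11.4, 17.8]  # hypothetical overdose mortality rates for the same years

rho, p_value = spearmanr(udt_positivity_pct, overdose_deaths_per_100k)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
```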
A changing landscape
Researchers said the strong correlation between overdose deaths and UDT results for synthetic opioids and methamphetamine is likely explained by the drugs’ availability and lethality.
“The most important thing that we found was just the strength of the correlation, which goes right to the heart of why we considered correlation to be so critical,” lead author Penn Whitley, senior director of bioinformatics for Millennium Health, told this news organization. “We needed to demonstrate that there was a strong correlation of just the UDT positivity rates with mortality – in this case, fatal drug overdose rates – as a steppingstone to build out tools that could utilize UDT as a real-time data source.”
While the main goal of the study was to establish correlation between UDT results and national mortality data, the study also offers a view of a changing landscape in the opioid epidemic.
Overall, UDT positivity for total synthetic opioids increased from 2.1% in 2013 to 19.1% in 2020 (a 792.5% increase). Positivity rates for all included drug categories increased when synthetic opioids were present.
However, in the absence of synthetic opioids, UDT positivity decreased for almost all drug categories from 2013 to 2020 (from 7.7% to 4.7% for cocaine; 3.9% to 1.6% for heroin; 20.5% to 6.9% for other opioids).
Only methamphetamine positivity increased with or without involvement of synthetic opioids. With synthetic opioids, meth positivity rose from 0.1% in 2013 to 7.9% in 2020. Without them, meth positivity rates still rose, from 2.1% in 2013 to 13.1% in 2020.
The findings track with an earlier study showing methamphetamine-involved overdose deaths rose sharply between 2011 and 2018.
“The data from this manuscript support that the opioid epidemic is transitioning from an opioid epidemic to a polysubstance epidemic where illicit synthetic opioids, largely fentanyl, in combination with other substances are now responsible for upwards of 80% of OD deaths,” Dr. Jackson said.
In an accompanying editorial, Jeffrey Brent, MD, PhD, clinical professor in internal medicine at the University of Colorado at Denver, Aurora, and Stephanie T. Weiss, MD, PhD, staff clinician in the Translational Addiction Medicine Branch at the National Institute on Drug Abuse, Baltimore, note that as new agents emerge, different harm-reduction strategies will be needed, adding that having a real-time tool to identify the trends will be key to preventing deaths.
“Surveillance systems are an integral component of reducing morbidity and mortality associated with illicit drug use. On local, regional, and national levels, information of this type is needed to most efficiently allocate limited resources to maximize benefit and save lives,” Dr. Brent and Dr. Weiss write.
The study was funded by Millennium Health and the National Center for Advancing Translational Sciences. Full disclosures are included in the original articles, but no sources reported conflicts related to the study.
A version of this article first appeared on Medscape.com.
Breast cancer deaths take a big dip because of new medicines
CHICAGO – Progress in breast cancer treatment over the past 2 decades has reduced expected mortality from both early-stage and metastatic disease, according to a new model that looked at 10-year distant recurrence-free survival and survival time after metastatic diagnosis, among other factors.
“There has been an accelerating influx of new treatments for breast cancer starting around 1990. We wished to ask whether and to what extent decades of metastatic treatment advances may have affected population level breast cancer mortality,” said Jennifer Lee Caswell-Jin, MD, during a presentation of the study at the annual meeting of the American Society of Clinical Oncology.
“Our models find that metastatic treatments improved population-level survival in all breast cancer subtypes since 2000 with substantial variability by subtype,” said Dr. Caswell-Jin, who is a medical oncologist with Stanford (Calif.) Medicine specializing in breast cancer.
The study is based on an analysis of four models from the Cancer Intervention and Surveillance Modeling Network (CISNET). The models simulated breast cancer mortality between 2000 and 2019, factoring in the use of mammography, the efficacy and dissemination of estrogen receptor (ER)– and HER2-specific treatments for early-stage (stages I-III) and metastatic (stage IV or distant recurrence) disease, and non–cancer-related mortality. The models compared overall and ER/HER2-specific breast cancer mortality rates during this period with estimated rates with no screening or treatment, and then attributed mortality reductions to screening, early-stage treatment, or metastatic treatment.
The results were compared with three clinical trials that tested therapies in different subtypes of metastatic disease. Dr. Caswell-Jin and colleagues adjusted the analysis to reflect expected differences between clinical trial populations and the broader population by sampling simulated patients who resembled the trial population.
The investigators found that, at 71%, the biggest drop in mortality rates was for women with ER+/HER2+ breast cancer, followed by 61% for women with ER-/HER2+ breast cancer and 59% for women with ER+/HER2– breast cancer. Triple-negative breast cancer – one of the most challenging breast cancers to treat – saw a drop of only 40% during this period. About 19% of the overall reduction in breast cancer mortality was attributable to treatments after metastasis.
The median survival after a diagnosis of ER+/HER2– metastatic recurrence increased from 2 years in 2000 to 3.5 years in 2019. In triple-negative breast cancer, the increase was more modest, from 1.2 years in 2000 to 1.8 years in 2019. After a diagnosis of metastatic recurrence of ER+/HER2+ breast cancer, median survival increased from 2.3 years in 2000 to 4.8 years in 2019, and for ER–/HER2+ breast cancer, from 2.2 years in 2000 to 3.9 years in 2019.
“How much metastatic treatments contributed to the overall mortality reduction varied over time depending on what therapies were entering the metastatic setting at that time and what therapies were transitioning from the metastatic to early-stage setting,” Dr. Caswell-Jin said.
The study did not include sacituzumab govitecan for metastatic triple-negative breast cancer, or trastuzumab deruxtecan and tucatinib for HER2-positive disease, which were approved after 2020. “The numbers that we cite will be better today for triple-negative breast cancer because of those two drugs. And will be even better for HER2-positive breast cancer because of those two drugs,” she said.
During the Q&A portion of the presentation, Daniel Hayes, MD, the Stuart B. Padnos Professor of Breast Cancer Research at the University of Michigan Rogel Cancer Center, Ann Arbor, asked about the potential of CISNET as an in-practice diagnostic tool.
“We’ve traditionally told patients who have metastatic disease that they will not be cured. I told two patients that on Tuesday. Can CISNET modeling let us begin to see if there is indeed now, with the improved therapies we have, a group of patients who do appear to be cured, or is that not possible?” he asked.
Perhaps, Dr. Caswell-Jin replied: in a very small population of older patients with HER2-positive breast cancer, that did in fact occur, but to a very small degree.
AT ASCO 2022
SGLT2 inhibitors cut AFib risk in real-world analysis
NEW ORLEANS – The case continues to grow for prioritizing a sodium-glucose transporter 2 (SGLT2) inhibitor in patients with type 2 diabetes, as real-world evidence of benefit and safety accumulates on top of the data from randomized trials that first established this class as a management pillar.
Another important effect of these agents gaining increasing currency, on top of their well-established benefits in patients with type 2 diabetes for preventing acute heart failure exacerbations and slowing progression of diabetic kidney disease, is that they cut the incidence of new-onset atrial fibrillation (AFib). That effect was confirmed in an analysis of data from about 300,000 U.S. patients included in recent Medicare records, Elisabetta Patorno, MD, reported at the annual scientific sessions of the American Diabetes Association.
But despite documentation like this, real-world evidence also continues to show limited uptake of SGLT2 inhibitors in U.S. patients with type 2 diabetes. Records from more than 1.3 million patients with type 2 diabetes managed in the Veterans Affairs Healthcare System during 2019 or 2020 documented that just 10% of these patients received an agent from this class, even though all were eligible to receive it, according to findings in a separate report at the meeting.
The AFib analysis included two sets of propensity score–matched Medicare patients aged 65 years or older with type 2 diabetes and no history of AFib during 2013-2018. One analysis focused on 80,475 matched patients who started treatment with either an SGLT2 inhibitor or a glucagonlike peptide–1 (GLP-1) receptor agonist, and a second on 74,868 matched patients who began either an SGLT2 inhibitor or a dipeptidyl peptidase–4 (DPP4) inhibitor. In both analyses, matching involved more than 130 variables. In both pair sets, patients at baseline averaged about 72 years old, nearly two-thirds were women, about 8%-9% had heart failure, 77%-80% were on metformin, and 20%-25% were using insulin.
The study’s primary endpoint was the incidence of hospitalization for AFib, which occurred a significant 18% less often in patients who started an SGLT2 inhibitor, compared with those who started a DPP4 inhibitor, during a median follow-up of 6.7 months, and a significant 10% less often, compared with those starting a GLP-1 receptor agonist, during a median follow-up of 6.0 months, Dr. Patorno reported at the meeting. This worked out to 3.7 fewer hospitalizations for AFib per 1,000 patient-years of follow-up among people who received an SGLT2 inhibitor, compared with a DPP4 inhibitor, and a decrease of 1.8 hospitalizations per 1,000 patient-years, compared with patients on a GLP-1 receptor agonist.
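For orientation, here is a minimal sketch of how an incidence rate difference per 1,000 patient-years is derived from event counts and person-time. The counts and follow-up totals below are hypothetical placeholders, not values from the Medicare analysis.

```python
# Minimal sketch of an incidence rate difference calculation; all event counts
# and person-time values are hypothetical, not taken from the Medicare analysis.
def rate_per_1000_py(events: int, person_years: float) -> float:
    """Incidence rate expressed per 1,000 patient-years."""
    return 1000 * events / person_years

sglt2_rate = rate_per_1000_py(events=180, person_years=22_000)  # ~8.2 per 1,000 patient-years
dpp4_rate = rate_per_1000_py(events=265, person_years=22_000)   # ~12.0 per 1,000 patient-years

print(f"Rate difference: {dpp4_rate - sglt2_rate:.1f} fewer AFib hospitalizations per 1,000 patient-years")
```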
Two secondary outcomes showed significantly fewer episodes of newly diagnosed AFib, and significantly fewer patients initiating AFib treatment among those who received an SGLT2 inhibitor relative to the comparator groups. In addition, these associations were consistent across subgroup analyses that divided patients by their age, sex, history of heart failure, and history of atherosclerotic cardiovascular disease.
AFib effects add to benefits
The findings “suggest that initiation of an SGLT2 inhibitor may be beneficial in older adults with type 2 diabetes who are at risk for AFib,” said Dr. Patorno, a researcher in the division of pharmacoepidemiology and pharmacoeconomics at Brigham and Women’s Hospital, Boston. “These new findings on AFib may be helpful when weighing the potential risks and benefits of various glucose-lowering drugs in older patients with type 2 diabetes.”
This new evidence follows several prior reports from other research groups of data supporting an AFib benefit from SGLT2 inhibitors. The earlier reports include a post hoc analysis of more than 17,000 patients enrolled in the DECLARE-TIMI 58 cardiovascular outcome trial of dapagliflozin (Farxiga), which showed a 19% relative decrease in the rate of incident AFib or atrial flutter events during a median 4.2-year follow-up.
Other prior reports that found a reduced incidence of AFib events linked with SGLT2 inhibitor treatment include a 2020 meta-analysis based on data from more than 38,000 patients with type 2 diabetes enrolled in any of 16 randomized, controlled trials, which found a 24% relative risk reduction. In addition, an as-yet unpublished report from researchers at the University of Rochester (N.Y.) and their associates, presented in November 2021 at the annual scientific sessions of the American Heart Association, documented a significant 24% relative risk reduction in incident AFib events linked to SGLT2 inhibitor treatment in a prospective study of 13,890 patients at several hospitals in Israel or the United States.
Evidence ‘convincing’ in totality
The accumulated evidence for a reduced incidence of AFib when patients were on treatment with an SGLT2 inhibitor is “convincing because it’s real world data that complements what we know from clinical trials,” commented Silvio E. Inzucchi, MD, professor of medicine at Yale University and director of the Yale Medicine Diabetes Center in New Haven, Conn., who was not involved with the study.
“If these drugs reduce heart failure, they may also reduce AFib. Heart failure patients easily slip into AFib,” he noted in an interview, but added that “I don’t think this explains all cases” of the reduced AFib incidence.
Dr. Patorno offered a few other possible mechanisms for the observed effect. The class may work by reducing blood pressure, weight, inflammation, oxidative stress, mitochondrial dysfunction, atrial remodeling, and AFib susceptibility. These agents are also known to cause natriuresis and diuresis, which could reduce atrial dilation, a mechanism that again relates the AFib effect to the better documented reduction in acute heart failure exacerbations.
“With the diuretic effect, we’d expect less overload at the atrium and less dilation, and the same mechanism would reduce heart failure,” she said in an interview.
“If you reduce preload and afterload you may reduce stress on the ventricle and reduce atrial stretch, and that might have a significant effect on atrial arrhythmia,” agreed Dr. Inzucchi.
EMPRISE produces more real-world evidence
A pair of additional reports at the meeting that Dr. Patorno coauthored provided real-world evidence supporting the dramatic heart failure benefit of the SGLT2 inhibitor empagliflozin (Jardiance) in U.S. patients with type 2 diabetes, compared with alternative drug classes. The EMPRISE study used data from the Medicare, Optum Clinformatics, and MarketScan databases during the period from August 2014, when empagliflozin became available, to September 2019. The study used more than 140 variables to match patients treated with either empagliflozin or a comparator agent.
The results showed that, in an analysis of more than 130,000 matched pairs, treatment with empagliflozin was linked to a significant 30% reduction in the incidence of hospitalization for heart failure, compared with patients treated with a GLP-1 receptor agonist. Analysis of more than 116,000 matched pairs of patients showed that treatment with empagliflozin was linked with a significant 29%-50% reduced rate of hospitalization for heart failure, compared with matched patients treated with a DPP4 inhibitor.
These findings “add to the pool of information” on the efficacy of agents from the SGLT2 inhibitor class, Dr. Patorno said in an interview. “We wanted to look at the full range of patients with type 2 diabetes who we see in practice,” rather than the more selected group of patients enrolled in randomized trials.
SGLT2 inhibitor use lags even when cost isn’t an issue
Despite all the accumulated evidence for efficacy and safety of the class, usage remains low, Julio A. Lamprea-Montealegre, MD, PhD, a cardiologist at the University of California, San Francisco, reported in a separate talk at the meeting. The study he presented examined records for 1,319,500 adults with type 2 diabetes managed in the VA Healthcare System during 2019 and 2020. Despite being in a system that “removes the influence of cost,” just 10% of these patients received treatment with an SGLT2 inhibitor, and 7% received treatment with a GLP-1 receptor agonist.
Notably, his analysis further showed that treatment with an SGLT2 inhibitor was especially depressed among patients with an estimated glomerular filtration rate (eGFR) of 30-44 mL/min per 1.73 m2. In this subgroup, use of a drug from this class was only about two-thirds the rate seen in patients with an eGFR of at least 90 mL/min per 1.73 m2. His findings also documented lower rates of use in patients at higher risk for atherosclerotic cardiovascular disease. Dr. Lamprea-Montealegre called this a “treatment paradox,” in which the patients likely to get the most benefit from an SGLT2 inhibitor were also less likely to actually receive it.
While his findings from the VA System suggest that drug cost is not the only factor driving underuse, the high price of the SGLT2 inhibitors, all of which currently remain on U.S. patents, is widely considered an important factor.
“There is a big problem of affordability,” said Dr. Patorno.
“SGLT2 inhibitors should probably be first-line therapy” for many patients with type 2 diabetes, said Dr. Inzucchi. “The only thing holding it back is cost,” a situation that he hopes will dramatically shift once agents from this class become generic and have substantially lower price tags.
The EMPRISE study received funding from Boehringer Ingelheim, the company that markets empagliflozin (Jardiance). Dr. Patorno had no relevant commercial disclosures. Dr. Inzucchi is an adviser to Abbott Diagnostics, Esperion Therapeutics, and vTv Therapeutics, a consultant to Merck and Pfizer, and has other relationships with AstraZeneca, Boehringer Ingelheim, Lexicon, and Novo Nordisk. Dr. Lamprea-Montealegre had received research funding from Bayer.
AT ADA 2022
Promising treatment option for incurable lung cancer described as ‘significant’
Neoadjuvant nivolumab plus chemotherapy significantly improved outcomes in patients with resectable stage IIIA non–small cell lung cancer (NSCLC), according to researchers reporting earlier this month in Chicago at the annual meeting of the American Society of Clinical Oncology.
Advanced stage IIIA NSCLC is incurable in most patients with lung cancer, and with existing treatments only 30% of patients will live up to 5 years. In this study, adding nivolumab to neoadjuvant chemotherapy significantly increased the pathological complete response rate: 36.2% of patients achieved a complete response, compared with 6.8% of those who received chemotherapy alone, said study author Mariano Provencio-Pulla, MD, PhD, Instituto Investigacion Sanitaria Puerta de Hierro-Segovia de Arana, Spain. The major pathologic response (MPR) rate – which reflects residual viable tumor of 10% or less – was also better in the treatment group than among patients who received chemotherapy alone (52% vs. 14%). The objective response rate (ORR) – the percentage of patients who had a partial or complete response to treatment – was 74% in the treatment group, compared with 48% among patients who received chemotherapy alone.
“In our opinion this should be the standard of care for patients,” Dr. Provencio-Pulla said during his presentation.
The ASCO treatment guidelines for stage III NSCLC specify that some patients can receive immunotherapy for up to a year, but for resectable stage III disease, this therapy is still under investigation.
In this study, called NADIM II (NCT03838159), investigators enrolled 87 patients with resectable clinical stage IIIA disease between February 2019 and November 2021. NADIM II is an open-label, randomized, two-arm, phase 2, multicenter clinical trial. Patients had ECOG scores of 0-1 and no known EGFR/ALK alterations. Before surgery, patients received three cycles, every 21 days, of nivolumab 360 mg plus paclitaxel 200 mg/m2 and carboplatin AUC 5, or chemotherapy alone. Patients whose resection left no microscopic tumor in the primary tumor bed received adjuvant nivolumab for 6 months, starting between 3 and 8 weeks after surgery.
Almost all (91%) of the patients who received immunotherapy plus chemotherapy underwent surgery, compared with 69% of patients in the chemotherapy-alone group. In the treatment group, patients with a pathological complete response (pCR) had higher PD-L1 tumor proportion scores (TPS; median, 70%).
The primary endpoint was pathological complete response, defined as 0% viable tumor cells in the resected lung and lymph nodes; major pathological response was defined as no more than 10% viable tumor remaining. Secondary endpoints included overall response rate, toxicity profile, and potential predictive biomarkers.
The addition of nivolumab to neoadjuvant chemotherapy significantly improved pCR (odds ratio, 7.88). The safety profile was “tolerable,” with a moderate increase in grade 3-4 toxicity, and no surgery was delayed because of treatment-related problems, Dr. Provencio-Pulla said.
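As a rough check on that figure, here is a minimal sketch of how an odds ratio is computed from two response proportions. It uses the rounded pCR rates quoted above, so it only approximates the reported odds ratio, which came from the trial’s own analysis.

```python
# Minimal sketch: odds ratio from two response proportions.
# The inputs are the rounded pCR rates quoted in the article (36.2% vs. 6.8%),
# so the result only approximates the reported odds ratio of 7.88.
def odds(p: float) -> float:
    """Convert a proportion into odds."""
    return p / (1 - p)

pcr_nivo_chemo = 0.362   # pCR rate with nivolumab plus chemotherapy
pcr_chemo_alone = 0.068  # pCR rate with chemotherapy alone

odds_ratio = odds(pcr_nivo_chemo) / odds(pcr_chemo_alone)
print(f"Approximate odds ratio: {odds_ratio:.2f}")  # prints roughly 7.8
```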
This study was funded by Fundación GECP. Dr. Provencio-Pulla has received funding from Bristol-Myers Squibb, the maker of Opdivo (nivolumab).
, according to researchers reporting earlier this month in Chicago at the annual meeting of the American Society of Clinical Oncology.
Advanced stage IIIA NSCLC is incurable in most patients with lung cancer, and with existing treatments only 30% of patients will live up to 5 years. In this study, neoadjuvant chemotherapy with nivolumab significantly increased the pathological complete response rate in 36.2% of patients, compared with 6.8% who received chemo alone, said study author Mariano Provencio-Pulla, MD, PhD, Instituto Investigacion Sanitaria Puerta de Hierro-Segovia de Arana, Spain. The major pathologic response (MPR) – which accounts for residual viable tumor of less than or equal to 10 – was better in the treatment group as compared with patients who received chemotherapy alone (52% vs 14%). The objective response rate (ORR) – or, the percentage of patients who had a partial or complete response to treatment – was 74% in the treatment group, compared with 48% among patients who received chemotherapy.
“In our opinion this should be the standard of care for patients,” Dr. Provencio-Pulla said during his presentation.
The ASCO treatment guidelines for stage III NSCLC, specify that some patients can receive immunotherapy for up to a year, but for resectable stage III disease, this therapy is still under investigation.
In this study, called NADIM II (NCT03838159), investigators enrolled 87 patients with resectable clinical stage IIIA disease between February 2019 and November 2021. NADIM II is an open-label, randomized, two-arm, phase 2, multicenter clinical trial. Patients had ECOG scores of 0-1 and no known EGFR/ALK alterations. Patients received either nivolumab 360 mg with paclitaxel 200 mg/m2 and carboplatin AUC5 for three cycles every 21 days as treatment before or after surgery. Patients who received a resection that left no microscopic tumor in the primary tumor bed, received adjuvant nivolumab between weeks 3 and 8 after surgery for 6 months.
At 91%, almost all patients who received the immunotherapy and chemotherapy treatment underwent surgery, compared with 69% of patients in the chemotherapy treatment group. In the treatment group, patients with pathological complete response (pCR) had higher PD-L1 tumor proportion score (TPS) scores (median 70%).
The primary endpoint was pathological complete response of 0% viable tumor cells in resected lung and lymph nodes. The major pathological response was no more than 10% viable tumor remaining. The secondary endpoints included overall response rate, toxicity profile, and potential predictive biomarkers.
The addition of neoadjuvant nivolumab to chemotherapy significantly improved pCR (odds ratio, 7.88). The safety profile was “tolerable” with a moderate increase in grade 3-4 toxicity; plus no surgery was delayed because of problems with the treatment, Dr. Provencio-Pulla said.
This study was funded by Fundación GECP. Dr. Provencio-Pulla has received funding from Bristol-Myers Squibb, the maker of Opdivo (nivolumab).
, according to researchers reporting earlier this month in Chicago at the annual meeting of the American Society of Clinical Oncology.
Advanced stage IIIA NSCLC is incurable in most patients with lung cancer, and with existing treatments only 30% of patients will live up to 5 years. In this study, neoadjuvant chemotherapy with nivolumab significantly increased the pathological complete response rate in 36.2% of patients, compared with 6.8% who received chemo alone, said study author Mariano Provencio-Pulla, MD, PhD, Instituto Investigacion Sanitaria Puerta de Hierro-Segovia de Arana, Spain. The major pathologic response (MPR) – which accounts for residual viable tumor of less than or equal to 10 – was better in the treatment group as compared with patients who received chemotherapy alone (52% vs 14%). The objective response rate (ORR) – or, the percentage of patients who had a partial or complete response to treatment – was 74% in the treatment group, compared with 48% among patients who received chemotherapy.
“In our opinion this should be the standard of care for patients,” Dr. Provencio-Pulla said during his presentation.
The ASCO treatment guidelines for stage III NSCLC, specify that some patients can receive immunotherapy for up to a year, but for resectable stage III disease, this therapy is still under investigation.
In this study, called NADIM II (NCT03838159), investigators enrolled 87 patients with resectable clinical stage IIIA disease between February 2019 and November 2021. NADIM II is an open-label, randomized, two-arm, phase 2, multicenter clinical trial. Patients had ECOG scores of 0-1 and no known EGFR/ALK alterations. Patients received either nivolumab 360 mg with paclitaxel 200 mg/m2 and carboplatin AUC5 for three cycles every 21 days as treatment before or after surgery. Patients who received a resection that left no microscopic tumor in the primary tumor bed, received adjuvant nivolumab between weeks 3 and 8 after surgery for 6 months.
At 91%, almost all patients who received the immunotherapy and chemotherapy treatment underwent surgery, compared with 69% of patients in the chemotherapy treatment group. In the treatment group, patients with pathological complete response (pCR) had higher PD-L1 tumor proportion score (TPS) scores (median 70%).
The primary endpoint was pathological complete response, defined as 0% viable tumor cells in the resected lung and lymph nodes; major pathological response was defined as no more than 10% viable tumor remaining. Secondary endpoints included overall response rate, toxicity profile, and potential predictive biomarkers.
The addition of neoadjuvant nivolumab to chemotherapy significantly improved the pCR rate (odds ratio, 7.88). The safety profile was “tolerable,” with a moderate increase in grade 3-4 toxicity, and no surgery was delayed because of treatment-related problems, Dr. Provencio-Pulla said.
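As a rough plausibility check (not part of the trial's own analysis), the crude odds ratio implied by the reported pCR rates of 36.2% and 6.8% lands close to the published figure, which presumably reflects the study's adjusted model:

$$ \mathrm{OR}_{\text{crude}} = \frac{0.362/(1-0.362)}{0.068/(1-0.068)} \approx \frac{0.567}{0.073} \approx 7.8, $$

in line with the reported odds ratio of 7.88.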
This study was funded by Fundación GECP. Dr. Provencio-Pulla has received funding from Bristol-Myers Squibb, the maker of Opdivo (nivolumab).
FROM ASCO 2022
Opioid use in the elderly a dementia risk factor?
in new findings that suggest exposure to these drugs may be another modifiable risk factor for dementia.
“Clinicians and others may want to consider that opioid exposure in those aged 75-80 increases dementia risk, and to balance the potential benefits of opioid use in old age with adverse side effects,” said Stephen Z. Levine, PhD, professor, department of community mental health, University of Haifa (Israel).
The study was published online in the American Journal of Geriatric Psychiatry.
Widespread use
Evidence points to a relatively high rate of opioid prescriptions among older adults. A Morbidity and Mortality Weekly Report noted that 19.2% of the U.S. adult population filled an opioid prescription in 2018, with the rate in those over age 65 more than double that of adults aged 20-24 years (25% vs. 11.2%).
Disorders and illnesses for which opioids might be prescribed, including cancer and some pain conditions, “are far more prevalent in old age than at a younger age,” said Dr. Levine.
This high rate of opioid use underscores the need to consider the risks of opioid use in old age, said Dr. Levine. “Unfortunately, studies of the association between opioid use and dementia risk in old age are few, and their results are inconsistent.”
The study included 91,307 Israeli citizens aged 60 and over without dementia who were enrolled in the Meuhedet Healthcare Services, a nonprofit health maintenance organization (HMO) serving 14% of the country’s population. Meuhedet has maintained an up-to-date dementia registry since 2002.
The average age of the study sample was 68.29 years at the start of the study (in 2012).
In Israel, opioids are prescribed for a 30-day period. In this study, opioid exposure was defined as opioid medication fills covering 60 days (or two prescriptions) within a 120-day interval.
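As a minimal sketch of how such an exposure flag might be derived from pharmacy fill records (the data layout, function, and field names below are illustrative assumptions, not the study's actual code), the logic amounts to finding at least two 30-day fills within any 120-day interval:

```python
# Illustrative only: flags opioid "exposure" as >= 60 days of supply
# (i.e., two 30-day fills) dispensed within any 120-day interval,
# mirroring the definition described in the article.
from datetime import date

FILL_DAYS = 30        # each opioid prescription covers 30 days
WINDOW_DAYS = 120     # interval in which fills are counted
MIN_SUPPLY_DAYS = 60  # equivalent to two fills

def is_exposed(fill_dates: list[date]) -> bool:
    """Return True if >= 60 days of supply fall within any 120-day interval."""
    fills = sorted(fill_dates)
    for i, anchor in enumerate(fills):
        supply = sum(FILL_DAYS for d in fills[i:] if (d - anchor).days < WINDOW_DAYS)
        if supply >= MIN_SUPPLY_DAYS:
            return True
    return False

# Two fills 45 days apart -> exposed; a single fill -> not exposed
print(is_exposed([date(2014, 3, 1), date(2014, 4, 15)]))  # True
print(is_exposed([date(2014, 3, 1)]))                     # False
```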
The primary outcome was incident dementia during follow-up from Jan. 1, 2013 to Oct. 30, 2017. The analysis controlled for a number of factors, including age, sex, smoking status, health conditions such as arthritis, depression, diabetes, osteoporosis, cognitive decline, vitamin deficiencies, cancer, cardiovascular conditions, and hospitalizations for falls.
Researchers also accounted for the competing risk of mortality.
During the study, 3.1% of subjects were exposed to opioids at a mean age of 73.94 years, and 5.8% of subjects developed dementia at an average age of 78.07 years.
Increased dementia risk
The risk of incident dementia was significantly increased in those exposed to opioids versus unexposed individuals in the 75- to 80-year age group (adjusted hazard ratio, 1.39; 95% confidence interval, 1.01-1.92; z statistic = 2.02; P < .05).
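As a quick consistency check on the reported statistics (not part of the study itself), the z statistic can be recovered from the hazard ratio and its confidence interval, since the standard error of the log hazard ratio is the log-scale interval width divided by 2 × 1.96:

$$ \mathrm{SE} = \frac{\ln 1.92 - \ln 1.01}{2 \times 1.96} \approx 0.164, \qquad z = \frac{\ln 1.39}{0.164} \approx 2.0, $$

consistent with the reported value of 2.02.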
The authors noted that the effect size for opioid exposure in this elderly age group is similar to that of other potentially modifiable risk factors for dementia, including body mass index and smoking.
The current study could not determine the biological explanation for the increased dementia risk among older opioid users. “Causal notions are challenging in observational studies and should be viewed with caution,” Dr. Levine noted.
However, a plausible mechanism highlighted in the literature is that opioids promote apoptosis of microglia and neurons that contribute to neurodegenerative diseases, he said.
The study included 14 sensitivity analyses, including those that looked at females, subjects older than 70, smokers, and groups with and without comorbid health conditions. The only sensitivity analysis that didn’t have similar findings to the primary analysis looked at dementia risk restricted to subjects without a vitamin deficiency.
“It’s reassuring that 13 of 14 sensitivity analyses found a significant association between opioid exposure and dementia risk,” said Dr. Levine.
Some prior studies did not show an association between opioid exposure and dementia risk. One possible reason for the discrepancy with the current findings is that the previous research didn’t account for age-specific opioid use effects, or the competing risk of mortality, said Dr. Levine.
Clinicians have a number of potential alternatives to opioids for treating various conditions, including acetaminophen, nonsteroidal anti-inflammatory drugs, amine reuptake inhibitors (ARIs), membrane stabilizers, muscle relaxants, topical capsaicin, botulinum toxin, cannabinoids, and steroids.
A limitation of the study was that it didn’t adjust for all possible comorbid health conditions, including vascular conditions, or for benzodiazepine use and surgical procedures.
In addition, since up to 50% of dementia cases go undetected, it’s possible that some in the opioid-unexposed group actually had undiagnosed dementia, which would reduce the effect sizes in the results.
Reverse causality is also a possibility as the neuropathological process associated with dementia could have started prior to opioid exposure. In addition, the results are limited to prolonged opioid exposure.
Interpret with caution
Commenting on the study, David Knopman, MD, a neurologist at Mayo Clinic in Rochester, Minn., whose research involves late-life cognitive disorders, was skeptical.
“On the face of it, the fact that an association was seen only in one narrow age range – 75+ to 80 years – ought to raise serious suspicion about the reliability and validity of the claim that opioid use is a risk factor for dementia,” he said.
Although the researchers performed several sensitivity analyses, including accounting for mortality, “pharmacoepidemiological studies are terribly sensitive to residual biases” related to physician and patient choices related to medication use, added Dr. Knopman.
The claim that opioids are a dementia risk “should be viewed with great caution” and should not influence use of opioids where they’re truly indicated, he said.
“It would be a great pity if patients with pain requiring opioids avoid them because of fears about dementia based on the dubious relationship between age and opioid use.”
Dr. Levine and Dr. Knopman report no relevant financial disclosures.
A version of this article first appeared on Medscape.com.
FROM AMERICAN JOURNAL OF GERIATRIC PSYCHIATRY
Genetic testing for best antidepressant accurate, cost effective
new research suggests.
CYP2D6 and CYP2C19, from the cytochrome P450 family, are involved in the metabolism and elimination of various molecules, including medications. Variants in the genes encoding these enzymes affect the speed at which drugs are metabolized, altering their pharmacokinetic profiles.
The researchers studied 125 patients with MDD and used CYP2D6 and CYP2C19 genotyping to determine the presence of actionable phenotypes in line with Food and Drug Administration labeling.
They found that, in many cases, pharmacogenetic testing could have predicted poor response to the initial treatment selection and could have helped guide subsequent choices to improve outcomes.
In addition, a pharmacoeconomic evaluation that combined direct and indirect costs resulting from MDD with the prevalence of CYP2D6 and CYP2C19 phenotypes showed that testing for functional variants in both genes would be cost effective at a national level.
Had psychiatrists who treated patients in the study known about their metabolizing profiles, it “might have contributed to switches in medication” and could have reduced “delays in response,” said lead researcher Alessio Squassina, PhD, associate professor of pharmacology at the University of Cagliari (Italy).
The findings were presented at the European Psychiatric Association 2022 Congress.
Highly variable response rates
Dr. Squassina noted that the response to antidepressants is a “highly variable trait,” and while it is known that genetics play a role, their contribution is “still not completely understood.”
He explained that the use of pharmacogenetics, which leverages genetic information to guide treatment decision-making, has increased significantly.
While regulatory bodies, including the FDA, have been “very active” in defining strict criteria for interpreting the information from pharmacogenetic tests, there remains some “discrepancy” in their clinical utility.
Dr. Squassina said the FDA provides guidance on use of genetic testing on the labels of 34 psychiatric medications. Of these, 79% relate to CYP2D6, 12% relate to CYP2C19, and 9% relate to other genes.
These labels provide guidance on when genetic testing is recommended or required, as well as potentially clinically actionable gene-drug associations in patients with certain functional alleles.
However, Dr. Squassina noted that the distribution of such alleles is not the same across Europe, so a psychiatrist in Italy may be less likely than one in Norway or Sweden to see a patient whose phenotype affects treatment response or the risk of adverse events.
For the study, the investigators examined the frequency of CYP2D6 and CYP2C19 phenotypes in psychiatric patients in Sardinia and their relationship with pharmacologic treatment and cost-effectiveness.
They set out to recruit 200 patients with MDD who had a documented 5-year medical and treatment history, including alterations in treatment, adverse events, hospitalizations, suicide, and symptom scores, as well as sociodemographic variables.
An interim analysis of the first 125 patients recruited to the study showed that the most common CYP2D6 phenotype was normal metabolizers (NM), at 60.5%, followed by intermediate metabolizers (IM), at 28.2%, ultrarapid metabolizers (UR), at 8.9%, and poor metabolizers (PM), at 2.4%.
For CYP2C19, the most common phenotype was NM (49%), followed by IM (29.0%), UR (25.0%), and PM (4.0%). While there were differences from the overall European averages, they were not significant.
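To make the phenotype labels concrete, here is a minimal sketch of how star-allele diplotypes are commonly mapped to metabolizer categories, using CYP2C19 as the example; the allele functions and assignments follow widely used CPIC-style conventions and are illustrative only, not the genotyping pipeline used in this study.

```python
# Illustrative CYP2C19 diplotype -> metabolizer phenotype lookup.
# Allele functions follow common CPIC-style conventions (*1 normal function,
# *2 no function, *17 increased function); this is a teaching sketch only.
ALLELE_FUNCTION = {"*1": "normal", "*2": "none", "*17": "increased"}

def cyp2c19_phenotype(allele_a: str, allele_b: str) -> str:
    funcs = sorted(ALLELE_FUNCTION[a] for a in (allele_a, allele_b))
    if funcs == ["none", "none"]:
        return "PM"   # poor metabolizer
    if "none" in funcs:
        return "IM"   # intermediate metabolizer
    if funcs == ["increased", "increased"]:
        return "UR"   # ultrarapid metabolizer
    if "increased" in funcs:
        return "RM"   # rapid metabolizer (often grouped with UR in summaries)
    return "NM"       # normal metabolizer

print(cyp2c19_phenotype("*1", "*2"))    # IM
print(cyp2c19_phenotype("*17", "*17"))  # UR
```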
To illustrate the potential impact that pharmacogenetic testing could have had on patient care and outcomes, Dr. Squassina highlighted two cases.
The first concerned a patient with a CYP2D6 IM and CYP2C19 UR phenotype, who did not respond to escitalopram. The FDA drug label indicates this phenotype is actionable and recommends an alternative drug.
The patient was subsequently switched to venlafaxine. The FDA drug label on venlafaxine notes that patients with this phenotype are likely to have a suboptimal response to this drug, and again, this patient did not respond to treatment.
Another patient with a CYP2D6 NM and CYP2C19 IM phenotype was also prescribed escitalopram. The FDA label on this drug notes that patients with this phenotype can try venlafaxine but may not respond. Indeed, this patient did not respond and was switched to venlafaxine and started responding.
“The psychiatrists [in these cases] may have made different [drug] choices if they had known the genotypes in advance,” Dr. Squassina said.
Cost effective?
To determine the cost-effectiveness of screening for CYP2D6 and CYP2C19 phenotypes in patients with MDD, the researchers used real-world data to develop a Markov model of a hypothetical cohort of 2,000 patients with MDD, half of whom underwent pharmacogenetic testing, and estimated the impact on outcomes over an 18-week period.
The model included the cost of medications and hospitalization, psychiatric counseling, loss of productivity, and the estimated probability of response and adverse events, adjusted for the patient’s likelihood of having a particular metabolizing phenotype.
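To give a sense of how such a cohort model accumulates costs and QALYs and yields an incremental cost-effectiveness ratio, here is a toy two-arm sketch; every probability, cost, and utility below is an invented placeholder with no connection to the study's actual inputs.

```python
# Toy two-state cohort model (responding vs. not responding), cycled weekly
# over 18 weeks for a tested and an untested arm. All probabilities, costs,
# and utilities are invented placeholders, not the study's parameters.
CYCLES = 18        # weeks of follow-up
COHORT = 1000      # patients per arm

def run_arm(p_respond_per_week, weekly_cost, u_respond=0.80, u_no_respond=0.55):
    """Return (total cost, total QALYs) for one arm of the hypothetical cohort."""
    responding, not_responding = 0.0, float(COHORT)
    cost = qalys = 0.0
    for _ in range(CYCLES):
        newly_responding = not_responding * p_respond_per_week
        responding += newly_responding
        not_responding -= newly_responding
        cost += COHORT * weekly_cost
        # utilities are annual, so one week contributes utility / 52
        qalys += (responding * u_respond + not_responding * u_no_respond) / 52
    return cost, qalys

# Hypothetical inputs: testing costs more up front but raises response probability
cost_test, qaly_test = run_arm(p_respond_per_week=0.10, weekly_cost=120)
cost_none, qaly_none = run_arm(p_respond_per_week=0.07, weekly_cost=100)
cost_test += COHORT * 200   # one-off genotyping cost per patient (invented)

icer = (cost_test - cost_none) / (qaly_test - qaly_none)
print(f"ICER ≈ €{icer:,.0f} per QALY (illustrative numbers only)")
```

A full analysis would replace these placeholders with estimates drawn from the study's real-world data and would typically add discounting and probabilistic sensitivity analysis.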
Results showed that, for CYP2C19, compared with no testing, pharmacogenetic testing would be cost-effective at an incremental cost-effectiveness ratio (ICER) of €60,000 ($64,000 USD) per quality-adjusted life-year (QALY).
This, Dr. Squassina said, is “below the willingness to pay threshold” for health authorities in developed countries.
For CYP2D6, pharmacogenetic testing would become cost-effective at an ICER of approximately €47,000 ($40,000 USD) per QALY.
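For readers unfamiliar with the metric, the decision rule behind these statements is the standard one: a strategy is judged cost-effective when its incremental cost-effectiveness ratio does not exceed the willingness-to-pay threshold λ per QALY gained, that is,

$$ \mathrm{ICER} = \frac{C_{\text{testing}} - C_{\text{no testing}}}{E_{\text{testing}} - E_{\text{no testing}}} \le \lambda, $$

where C denotes total costs and E denotes QALYs under each strategy.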
The team plans to complete recruitment and perform a “detailed evaluation of all the variables, especially those relating to the medication history and changes in dosage, and adverse drug reactions.” The researchers would also like to study genetic phenotypes for other metabolizing enzymes and repeat the pharmacoeconomic analysis with the complete dataset.
A glimpse into the future
Approached for comment, Alessandro Serretti, MD, PhD, department of biomedical and neuromotor sciences, University of Bologna (Italy), who was not involved in the study, said the findings show there is a “small but evident benefit” from CYP profiling, “which makes sense.”
He added that in the Netherlands and other European countries, efforts are already underway to record the CYP status of patients at a national level. “Sooner or later, all Western countries will implement it as a routine,” he said in an interview.
He explained that, when such testing is widely available, electronic health record data will allow physicians to immediately select the optimal antidepressant for an individual patient. This will end the current trial-and-error process that leads to delayed treatment and will help avoid serious consequences, such as suicide.
While shortening a single patient’s treatment by a few weeks through the most appropriate antidepressant choice does not make a large difference in the cost per episode, at a population level it has the potential to make a significant difference.
Dr. Serretti does not envisage genotyping all 333 million Europeans for the CYP phenotype at this point but imagines that in the future, individuals will undergo whole-genome sequencing to determine risks for cancer, dementia, and heart disease, at which point they will also undergo CYP functional allele profiling, and all these data will be recorded on individuals’ EHR.
“So, every doctor, a psychiatrist or cardiologist, can see everything, whenever they need it,” he said.
The study was funded by Fondazione di Sardegna and Regione Autonoma della Sardegna. The authors disclosed no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM EPA 2022
ADA updates on finerenone, SGLT2 inhibitors, and race-based eGFR
As it gears up for its first in-person scientific sessions in 3 years, the American Diabetes Association has issued an addendum to its most recent annual clinical practice recommendations, published in December 2021 as the 2022 Standards of Medical Care in Diabetes, based on recent trial evidence and consensus.
The update informs clinicians about:
- The effect of the nonsteroidal mineralocorticoid receptor antagonist finerenone (Kerendia) on cardiovascular outcomes in patients with type 2 diabetes and chronic kidney disease.
- The effect of a sodium-glucose cotransporter 2 (SGLT2) inhibitor on heart failure and renal outcomes in patients with type 2 diabetes.
- The National Kidney Foundation and the American Society of Nephrology Task Force recommendation to remove race from the formula for calculating estimated glomerular filtration rate (eGFR).
“This is the fifth year that we are able to update the Standards of Care after it has been published through our Living Standards of Care updates, making it possible to give diabetes care providers the most important information and the latest evidence relevant to their practice,” Robert A. Gabbay, MD, PhD, ADA chief scientific and medical officer, said in a press release from the organization.
The addendum, entitled “Living Standards of Care,” updates Section 10, “Cardiovascular Disease and Risk Management,” and Section 11, “Chronic Kidney Disease and Risk Management,” of the 2022 Standards of Medical Care in Diabetes.
The amendments were approved by the ADA Professional Practice Committee, which is responsible for developing the Standards of Care. The American College of Cardiology reviewed and endorsed the section on CVD and risk management.
The Living Standards Update was published online in Diabetes Care.
CVD and risk management
In the Addendum to Section 10, “Cardiovascular Disease and Risk Management,” the committee writes:
- “For patients with type 2 diabetes and chronic kidney disease treated with maximum tolerated doses of ACE inhibitors or angiotensin receptor blockers, addition of finerenone should be considered to improve cardiovascular outcomes and reduce the risk of chronic kidney disease progression. A”
- “Patients with type 2 diabetes and chronic kidney disease should be considered for treatment with finerenone to reduce cardiovascular outcomes and the risk of chronic kidney disease progression.”
- “In patients with type 2 diabetes and established heart failure with either preserved or reduced ejection fraction, an SGLT2 inhibitor [with proven benefit in this patient population] is recommended to reduce risk of worsening heart failure, hospitalizations for heart failure, and cardiovascular death.”
In the section “Statin Treatment,” the addendum no longer states that “a prospective trial of a newer fibrate ... is ongoing,” because that trial investigating pemafibrate (Kowa), a novel selective peroxisome proliferator-activated receptor alpha modulator (or fibrate), has been discontinued.
Chronic kidney disease and risk management
In the Addendum to Section 11, “Chronic Kidney Disease and Risk Management,” the committee writes:
- “Traditionally, eGFR is calculated from serum creatinine using a validated formula. The Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation is preferred. ... Historically, a correction factor for muscle mass was included in a modified equation for African Americans; however, due to various issues with inequities, it was decided to [refit] the equation such that it applies to all. Hence, a committee was convened, resulting in the recommendation for immediate implementation of the CKD-EPI creatinine equation refit without the race variable in all laboratories in the U.S.” (This is based on a National Kidney Foundation and American Society of Nephrology Task Force recommendation; the refit equation is shown after this list.)
- “Additionally, increased use of cystatin C, especially to confirm estimated GFR in adults who are at risk for or have chronic kidney disease, because combining filtration markers (creatinine and cystatin C) is more accurate and would support better clinical decisions than either marker alone.”
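For orientation, the refit (race-free) CKD-EPI creatinine equation referenced above has the following published 2021 form; the coefficients are reproduced here only as a reference sketch and should be verified against the primary source before any clinical use:

$$ \mathrm{eGFR} = 142 \times \min\!\left(\tfrac{S_{cr}}{\kappa}, 1\right)^{\alpha} \times \max\!\left(\tfrac{S_{cr}}{\kappa}, 1\right)^{-1.200} \times 0.9938^{\mathrm{Age}} \times 1.012\ [\text{if female}], $$

where S_cr is serum creatinine in mg/dL, κ is 0.7 for women and 0.9 for men, α is −0.241 for women and −0.302 for men, and eGFR is expressed in mL/min/1.73 m².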
Evidence from clinical trials
The update is based on findings from the following clinical trials:
- Efficacy and Safety of Finerenone in Subjects With Type 2 Diabetes Mellitus and Diabetic Kidney Disease (FIDELIO-DKD)
- Efficacy and Safety of Finerenone in Subjects With Type 2 Diabetes Mellitus and the Clinical Diagnosis of Diabetic Kidney Disease (FIGARO-DKD)
- FIDELITY, a prespecified pooled analysis of FIDELIO-DKD and FIGARO-DKD
- Empagliflozin Outcome Trial in Patients With Chronic Heart Failure With Preserved Ejection Fraction (EMPEROR-Preserved)
- Effects of Dapagliflozin on Biomarkers, Symptoms and Functional Status in Patients with PRESERVED Ejection Fraction Heart Failure (PRESERVED-HF)
- Pemafibrate to Reduce Cardiovascular Outcomes by Reducing Triglycerides in Patients with Diabetes (PROMINENT).
A version of this article first appeared on Medscape.com.
FROM DIABETES CARE
Schizophrenia patients in long-term facilities benefit from lower-dose antipsychotics
NEW ORLEANS –
“There is an argument by some experts in the field that state hospital populations represent a different set of patients who require higher antipsychotic dosages, with no alternative, but I don’t agree with that,” study lead author Mujeeb U. Shad, MD, GME-psychiatry program director and adjunct professor at the University of Nevada, Las Vegas, said in an interview.
In reducing doses, “patients appeared to blossom, becoming more active and less ‘zombie-like’; they started taking more interest in activities and their social [involvement] increased,” he said.
The study was among several presenting pros and cons of high antipsychotic doses at the 2022 annual meeting of the American Psychiatric Association.
Higher doses of antipsychotics are often relied upon when patients with acute psychosis fail to respond to standard treatment; however, evidence supporting the approach is lacking.
And while some studies show no benefit from higher-dose maintenance therapy over conventional or even lower antipsychotic doses, evidence regarding forensic patients hospitalized in long-term psychiatric facilities is also scant.
Meanwhile, the need to restore competency in these patients can be more pressing than usual.
“In a forensic population where executive cognitive function is one of the key elements to restore competency to stand trial, the continuation of high-dose therapy with excessive dopamine blockade may further compromise preexisting executive dysfunction to delay competency restoration,” Dr. Shad notes in the study.
The study describes a case series in which antipsychotic doses were lowered among 22 of Dr. Shad’s patients who had been determined to be incompetent to stand trial and referred to a state hospital to restore their competency.
With the objective of restoring their mental fitness to stand trial so they could be discharged from the facility, patients on high-dose therapy, defined as more than 50% above the average package-insert dose, had their doses reduced to conventional dosages.
The approach led to as many as 68% of the patients being stabilized and discharged after having their competency restored, without symptom relapse, following an average antipsychotic dose reduction of 44%.
The average time to discharge following the dose reduction was just 2.3 months, after an average total hospitalization time of 11 months.
The shortest hospitalizations (less than 7 months) were observed among patients whose doses were not changed because they were already responding to standard dosages.
Two patients who were being treated subtherapeutically required dose increases and had the longest overall hospitalizations (14.5 months).
Additional benefits of reduced dosages
Dr. Shad noted that, in addition to the earlier discharges, patients also had reductions in polypharmacy and in prolactin levels.
“We know that high prolactin level is such a huge problem, especially for female patients because it can cause osteoporosis, infertility, and abnormal menstruation, and the reductions in hyperprolactinemia can help reduce weight gain,” he said.
Dr. Shad added that he let some of those effects be his guide in making dose reductions.
“I was trying to gradually minimize the dose while monitoring the patients for relapse, and I used extrapyramidal symptoms and prolactin levels as my guide, looking for a sweet spot with the dosing,” he said.
“For example, if patients were taking an average of about 40-60 mg of a drug, I brought it down close to 20 mg, or close to the average package insert,” Dr. Shad said.
Key concerns among clinicians about reducing antipsychotic doses include the emergence of discontinuation or rebound symptoms such as psychosis, akathisia, or parkinsonism; studies, including a recent meta-analysis, have supported those concerns and urged caution in reducing doses below standard levels.
However, Dr. Shad said his series suggests that reducing doses gradually while carefully monitoring extrapyramidal symptoms and prolactin levels may indeed pay off.
“They’re not the perfect guides, but they’re good guides, and with the right approach, [some] may be able to do this,” Dr. Shad said.
“However, the key to a successful dose reduction or discontinuation of an [antipsychotic medication] is to avoid abrupt discontinuation and follow a gradual dose reduction while monitoring symptoms and tolerability,” he said.
Commenting on the research, T. Scott Stroup, MD, professor of psychiatry at Columbia University, New York, came down on the side of caution with higher doses and supported the possible benefits of the lower-dose approach.
“I agree that people who need antipsychotic medications should receive the lowest effective dose and that often this is identified by careful dose reduction,” he said in an interview.
Dr. Shad and Dr. Stroup had no disclosures to report.
FROM APA 2022