Early change in emotional processing predicts antidepressant response
VIENNA – A novel method of individualizing antidepressant drug therapy while drastically shortening the time required to determine whether a given agent will be effective in a depressed patient is undergoing its definitive evaluation in five European countries.
“I think this study will provide a critical test of whether we can use these kinds of correlations with emotional processing of information to actually improve the treatment of depression,” Catherine J. Harmer, DPhil, said at the annual congress of the European College of Neuropsychopharmacology.
As early as 2009, she and her coinvestigators demonstrated that depression is associated with a measurable negative bias in the processing of emotional information. Depressed patients selectively pay more attention to and better remember negative information. For example, when depressed patients taking the Facial Expression Recognition Test are quickly shown a photo of a smiling face, they are more likely to describe it as “sad.” Similarly, in a word recall test that includes “positive” adjectives such as cheerful, poised, original, and optimistic and “negative” words such as mean, hostile, domineering, and untidy, they recall fewer positive words than nondepressed people.
This negative emotional bias is a key factor in maintenance of depression. Many weeks before patients report feeling improvement in their mood and clinical symptoms of depression in response to effective antidepressant medication, the drug produces a favorable effect on their cognitive biases in emotional processing, explained Dr. Harmer, professor of cognitive neuroscience and director of the Psychopharmacology and Emotional Research Lab at the University of Oxford (England).
Dr. Harmer hypothesized that antidepressants don’t necessarily act as direct mood enhancers, but instead change the balance away from negative to more positive emotional processing, resulting in neural modulation in limbic and prefrontal circuitry. These neural changes take time to reach the subjective conscious mind, which is why improvement in clinical symptoms of depression doesn’t manifest until 2-3 weeks into therapy, with the drug’s full effects not seen until 6-7 weeks (Br J Psychiatry. 2009 Aug;195[2]:102-8).
“Antidepressants can target cognitive biases surprisingly early on in treatment, before patients report any change in their clinical symptoms. This could explain the delay in antidepressant effect. You need life events, stressors, and environmental stimuli before a change in bias would be expected to result in a change in clinical state,” she said.
Together with her colleagues, she employed functional MRI to study changes in the brain associated with the improvement in negative biases in emotional processing that occur when depressed patients go on antidepressant medication. Forty-two unmedicated depressed patients were randomized to 10 mg of escitalopram (Lexapro) daily for 7 days or placebo. At baseline, all subjects demonstrated amygdala hyperactivity in response to fearful facial expressions, a response that didn’t occur in healthy controls. After 7 days of escitalopram – weeks before any improvement in depressed mood – amygdala activity was normalized in the active treatment group but not in placebo-treated controls (Psychol Med. 2012 Dec;42[12]:2609-17).
Dr. Harmer and a colleague expanded on her theory of depression in a review article aptly titled, “It’s the way that you look at it” (Philos Trans R Soc Lond B Biol Sci. 2013 Feb 25;368[1615]:20120407).
A measurable improvement in emotional processing can be seen within a few hours after a depressed patient takes the first dose of an effective antidepressant. Thus, early change in negative emotional processing is predictive of subsequent clinical outcome. Lack of an early shift to positive emotional processing has been associated in multiple studies by Dr. Harmer and others with a high likelihood that an antidepressant won’t provide significant improvement in depressive symptoms at week 6.
The predictive accuracy of tests of emotional processing is higher when testing is done after a patient has been on an antidepressant medication for a few days rather than after the very first dose. Based upon Dr. Harmer’s work, pharmaceutical companies are now using tests of change in emotional processing at 1 week to help screen and select novel treatments for depression and anxiety.
In PReDicT, depressed patients being treated in primary care clinics across Europe will undergo emotional processing testing at baseline. In the active intervention arm, participants will be retested after 1 week on antidepressant therapy in order to identify those who are unlikely to have a favorable clinical response to that drug, enabling physicians to accelerate decision making about the appropriate next treatment. Treatment decisions in the control group will be made without the emotional processing results, mirroring current everyday practice.
Instead of waiting 4-6 weeks before concluding that a switch to another antidepressant with a different mechanism of action is warranted, as is now routine, participating PReDicT physicians whose patients are in the active intervention arm can make an informed change after just 1 week. The study hypothesis is that participants randomized to this study arm will take less time to respond to antidepressant therapy, because their physicians will be able to find the right drug faster than in the control group. The primary study endpoint will be the percentage of patients in the two study arms showing at least a 50% reduction in their Quick Inventory of Depressive Symptomatology (QIDS-SR16) score at 8 weeks. Secondary endpoints will focus on cumulative health care costs at weeks 24 and 48.
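The primary endpoint is a simple dichotomous criterion, which can be sketched in code. The 50% reduction threshold is from the trial description; the function names and patient scores below are illustrative only, not trial data:

```python
def is_responder(baseline_qids: float, week8_qids: float) -> bool:
    """Response per the stated primary endpoint: at least a 50%
    reduction in QIDS-SR16 score from baseline to week 8."""
    return (baseline_qids - week8_qids) / baseline_qids >= 0.5

def response_rate(pairs) -> float:
    """Percentage of patients meeting the response criterion.
    `pairs` is a list of (baseline, week8) score tuples."""
    responders = sum(is_responder(b, w) for b, w in pairs)
    return 100.0 * responders / len(pairs)

# Hypothetical scores for illustration only
patients = [(18, 8), (20, 12), (16, 7), (22, 11)]
print(response_rate(patients))  # 75.0
```

The trial would compare this response rate between the guided and control arms at 8 weeks.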
The PReDicT study is being run by P1vital Products Ltd., an Oxfordshire health care company that will use its proprietary Internet-based eHealth Emotional Test Battery to assess early changes in emotional processing. The test battery, which is classified as a medical device, is based upon Dr. Harmer’s earlier work. She is an investigator in PReDicT.
Her research funding comes from the U.K. Medical Research Council, Eli Lilly, and the EU’s Horizon 2020 program.
Injectable male contraceptive on par with other reversible options
An injectable male contraceptive combining long-acting testosterone and progestogen achieved near-complete yet reversible suppression of sperm production, according to findings of a prospective phase II study.
“In the past 4 decades, studies have demonstrated that reversible hormonal suppression of spermatogenesis in men can prevent pregnancies in their female partners, although commercial product development has stalled,” wrote Hermann M. Behre, MD, of the Center for Reproductive Medicine and Andrology, Martin Luther University, Germany, and coauthors.
In this international multicenter study, 320 healthy men aged 18-45 in stable, monogamous, heterosexual partnerships – without known fertility problems but with no desire for pregnancy in the next 2 years – were given intramuscular injections of 200-mg norethisterone enanthate combined with 1,000-mg testosterone undecanoate every 8 weeks.
The study involved three phases. The initial suppression phase of treatment lasted up to 32 weeks, during which couples were told to use alternative nonhormonal contraception. Once two consecutive tests showed sperm concentrations at or below 1 million/mL, the couples began the 56-week efficacy phase, in which injections continued but couples were advised not to use any other form of contraception. This was followed by a recovery phase, initiated when sperm concentrations returned above 1 million/mL.
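The handoff from suppression to efficacy phase hinges on two consecutive tests at or below the threshold. That protocol rule can be sketched as follows; the threshold comes from the article, while the function and the test series are illustrative assumptions:

```python
SUPPRESSION_THRESHOLD = 1_000_000  # sperm per mL (at or below 1 million/mL)

def efficacy_phase_start(concentrations):
    """Return the index of the second of the first two consecutive
    tests at or below the threshold (the point at which the efficacy
    phase would begin), or None if suppression was never confirmed."""
    for i in range(1, len(concentrations)):
        if (concentrations[i - 1] <= SUPPRESSION_THRESHOLD
                and concentrations[i] <= SUPPRESSION_THRESHOLD):
            return i
    return None

# Hypothetical test series (sperm/mL): suppression confirmed at index 3
series = [15_000_000, 4_000_000, 900_000, 600_000]
print(efficacy_phase_start(series))  # 3
```

Requiring two consecutive sub-threshold results guards against a single anomalous low count triggering the unprotected phase prematurely.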
By 24 weeks after the first injection, 95.9% of those who received at least one injection showed sperm suppression to a concentration of less than or equal to 1 million/mL, according to a paper published online Oct. 27 in the Journal of Clinical Endocrinology & Metabolism.
There were four pregnancies during the efficacy phase of the trial, representing a rate of 1.57 per 100 continuing users, but all occurred before the 16th week of the efficacy phase. Three of these four participants had sperm concentrations at or below 1 million/mL but none was azoospermic around the estimated conception date.
“Effective and safe male contraception continues to be elusive,” said Sarah W. Prager, MD, who was asked to comment on the findings. “This study shows that only 75% of men who are already signing up to participate in a male contraceptive research study would be willing to use the proposed method. Additionally, there seem to still be some significant negative side effects, 61% of which were assessed to be directly related to the male contraceptive study drugs.
“A failure rate of 4% in a study setting is significantly lower than most female contraceptive methods, and many women would potentially not feel comfortable relying on a method with that type of failure rate. Finally, male contraception that is not directly witnessed by a female partner could be suspect, and hard to rely on for any but women in long-term, monogamous relationships,” according to Dr. Prager, associate professor of obstetrics and gynecology and director of the Ryan Family Planning Program at the University of Washington, Seattle.
“Continuing to seek effective and safe male contraception is a worthy endeavor, but this study tells me that we still have a ways to go before male hormonal contraception can be operationalized,” she said in an interview.
Further findings from the study showed that there were six cases of sperm rebound during the efficacy phase, with sperm concentrations ranging from 2 million to 16.6 million/mL. Overall, the treatment showed a combined failure rate of 7.5%, which included nonsuppression by the end of the suppression phase, sperm rebound during the efficacy phase, or pregnancy during the efficacy phase.
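The 7.5% figure pools three distinct failure modes into a single rate. A sketch of that calculation is below; the rebound (6) and pregnancy (4) counts are from the article, but the nonsuppression count and the denominator are hypothetical, chosen only so the arithmetic lands on the reported 7.5%:

```python
def combined_failure_rate(non_suppressed, rebounds, pregnancies, n_evaluable):
    """Pooled method-failure rate (%): nonsuppression by the end of the
    suppression phase, sperm rebound during the efficacy phase, or
    pregnancy during the efficacy phase, over evaluable participants."""
    failures = non_suppressed + rebounds + pregnancies
    return 100.0 * failures / n_evaluable

# Hypothetical nonsuppression count and denominator for illustration
print(round(combined_failure_rate(5, 6, 4, 200), 1))  # 7.5
```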
By week 52 of the recovery phase, the cumulative rate of recovery of spermatogenesis to a concentration of at least 15 million/mL was 94.8 per 100 continuing users (J Clin Endocrinol Metab. 2016 Oct 27. doi: 10.1210/jc.2016-2141).
However, eight men had not recovered spermatogenesis enough to meet the criteria for return to fertility; five of these showed restored sperm counts by week 74 of the recovery phase, two declined further follow-up, and one did not recover within 4 years of the last injection.
Nearly half of all participants reported acne, and 38.1% reported an increase in libido. There were also 65 reports of emotional disorders, although the authors noted that 62 of these came from a single center in Indonesia. Other adverse events included injection site pain (23.1%), myalgia (16.3%), and gynecomastia (5.6%).
The trial was terminated early because of concerns about the regimen's mood-related adverse effects. The authors argued, however, that similar effects had been observed in both the intervention and placebo arms of previous trials of this combination, which were designed to look only at suppression of spermatogenesis.
“Contraceptive efficacy studies cannot involve placebo groups for obvious ethical reasons,” they wrote. “Therefore, a definitive answer as to whether the potential risks of this hormonal combination for male contraception outweigh the potential benefits cannot be made based on the present results.”
The study was supported by the United Nations Development Programme/United Nations Population Fund/United Nations International Children’s Emergency Fund/World Health Organization/World Bank Special Programme of Research, Development, and Research Training in Human Reproduction (Human Reproduction Programme, World Health Organization, Geneva, Switzerland), and by CONRAD (Eastern Virginia Medical School, Arlington, Va.), using funding from the Bill & Melinda Gates Foundation and the U.S. Agency for International Development. No relevant conflicts of interest were declared. Dr. Prager reported having no relevant financial conflicts of interest.
An injectable male contraceptive combining long-acting testosterone and progestogen achieved near-complete yet reversible suppression of sperm production, according to findings of a prospective phase II study.
“In the past 4 decades, studies have demonstrated that reversible hormonal suppression of spermatogenesis in men can prevent pregnancies in their female partners, although commercial product development has stalled,” wrote Hermann M. Behre, MD, of the Center for Reproductive Medicine and Andrology, Martin Luther University, Germany, and coauthors.
In this international multicenter study, 320 healthy men aged 18-45 in stable, monogamous, heterosexual partnerships – without known fertility problems but with no desire for pregnancy in the next 2 years – were given intramuscular injections of 200-mg norethisterone enanthate combined with 1,000-mg testosterone undecanoate every 8 weeks.
The study involved phases. There was an initial suppression phase of treatment of up to 32 weeks, during which couples were told to use alternative nonhormonal contraception. Once two consecutive tests showed sperm concentrations at or below 1 million/mL, the couples began the 56-week efficacy phase where injections continued but couples were advised to not use any other form of contraception. This was followed by a recovery phase initiated when sperm concentrations returned above 1 million/mL.
By 24 weeks after the first injection, 95.9% of those who received at least one injection showed sperm suppression to a concentration of less than or equal to 1 million/mL, according to a paper published online Oct. 27 in the Journal of Clinical Endocrinology & Metabolism.
There were four pregnancies during the efficacy phase of the trial, representing a rate of 1.57 per 100 continuing users, but all occurred before the 16th week of the efficacy phase. Three of these four participants had sperm concentrations at or below 1 million/mL but none was azoospermic around the estimated conception date.
“Effective and safe male contraception continues to be elusive,” said Sarah W. Prager, MD, who was asked to comment on the findings. “This study shows that only 75% of men who are already signing up to participate in a male contraceptive research study would be willing to use the proposed method. Additionally, there seem to still be some significant negative side effects, 61% of which were assessed to be directly related to the male contraceptive study drugs.
“A failure rate of 4% in a study setting is significantly lower than most female contraceptive methods, and many women would potentially not feel comfortable relying on a method with that type of failure rate. Finally, male contraception that is not directly witnessed by a female partner could be suspect, and hard to rely on for any but women in long term, monogamous relationships, according to Dr. Prager, associate professor of obstetrics and gynecology and director of the Ryan Family Planning Program at the University of Washington, Seattle.
“Continuing to seek effective and safe male contraception is a worthy endeavor, but this study tells me that we still have a ways to go before male hormonal contraception can be operationalized,” she said in an interview.
Further findings from the study showed that there were six cases of sperm rebound during the efficacy phase, with sperm concentrations ranging from 2 million to 16.6 million/mL. Overall, the treatment showed a combined failure rate of 7.5%, which included nonsuppression by the end of the suppression phase, sperm rebound during the efficacy phase, or pregnancy during the efficacy phase.
By week 52 of the recovery phase, the cumulative rate of recovery of spermatogenesis to a concentration of at least 15 million/mL was 94.8 per 100 continuing users (J Clin Endocrin Metab. 2016, Oct 27. doi: 10.1210/jc.2016-2141).
However, eight men had not recovered spermatogenesis enough to meet the criteria for return to fertility; five of these showed restored sperm counts by week 74 of the recovery phase, two declined further follow-up, and one did not recover within 4 years of the last injection.
Nearly half of all participants reported acne, and 38.1% reported an increase in libido. There were also 65 reports of emotional disorders, although the authors noted that 62 of these reports all came from the same center in Indonesia. Other adverse events included injection site pain (23.1%), myalgia (16.3%), and gynecomastia (5.6%).
While the potential behavioral effects of the regimen were known, the trial was terminated early. The authors argued that similar effects have been observed both in the intervention and placebo arms of previous trials of this combination, which were designed to look only at suppression of spermatogenesis.
“Contraceptive efficacy studies cannot involve placebo groups for obvious ethical reasons,” they wrote. “Therefore, a definitive answer as to whether the potential risks of this hormonal combination for male contraception outweigh the potential benefits cannot be made based on the present results.”
The study was supported by United Nations Development Programme/United Nations Population Fund/United Nations International Children’s Emergency Fund/ World Health Organization/World Bank Special Programme of Research, Development, and Research Training in Human Reproduction (Human Reproduction Programme, World Health Organization, Geneva, Switzerland), and CONRAD (Eastern Virginia Medical School, Arlington, Va.) using funding from the Bill & Melinda Gates Foundation and U.S. Agency for International Development). No relevant conflicts of interest were declared. Dr. Prager reported having no relevant financial conflicts of interest.
An injectable male contraceptive combining long-acting testosterone and progestogen achieved near-complete yet reversible suppression of sperm production, according to findings of a prospective phase II study.
“In the past 4 decades, studies have demonstrated that reversible hormonal suppression of spermatogenesis in men can prevent pregnancies in their female partners, although commercial product development has stalled,” wrote Hermann M. Behre, MD, of the Center for Reproductive Medicine and Andrology, Martin Luther University, Germany, and coauthors.
In this international multicenter study, 320 healthy men aged 18-45 in stable, monogamous, heterosexual partnerships – without known fertility problems but with no desire for pregnancy in the next 2 years – were given intramuscular injections of 200-mg norethisterone enanthate combined with 1,000-mg testosterone undecanoate every 8 weeks.
The study involved phases. There was an initial suppression phase of treatment of up to 32 weeks, during which couples were told to use alternative nonhormonal contraception. Once two consecutive tests showed sperm concentrations at or below 1 million/mL, the couples began the 56-week efficacy phase where injections continued but couples were advised to not use any other form of contraception. This was followed by a recovery phase initiated when sperm concentrations returned above 1 million/mL.
By 24 weeks after the first injection, 95.9% of those who received at least one injection showed sperm suppression to a concentration of less than or equal to 1 million/mL, according to a paper published online Oct. 27 in the Journal of Clinical Endocrinology & Metabolism.
There were four pregnancies during the efficacy phase of the trial, representing a rate of 1.57 per 100 continuing users, but all occurred before the 16th week of the efficacy phase. Three of these four participants had sperm concentrations at or below 1 million/mL but none was azoospermic around the estimated conception date.
“Effective and safe male contraception continues to be elusive,” said Sarah W. Prager, MD, who was asked to comment on the findings. “This study shows that only 75% of men who are already signing up to participate in a male contraceptive research study would be willing to use the proposed method. Additionally, there seem to still be some significant negative side effects, 61% of which were assessed to be directly related to the male contraceptive study drugs.
“A failure rate of 4% in a study setting is significantly lower than most female contraceptive methods, and many women would potentially not feel comfortable relying on a method with that type of failure rate. Finally, male contraception that is not directly witnessed by a female partner could be suspect, and hard to rely on for any but women in long-term, monogamous relationships,” according to Dr. Prager, associate professor of obstetrics and gynecology and director of the Ryan Family Planning Program at the University of Washington, Seattle.
“Continuing to seek effective and safe male contraception is a worthy endeavor, but this study tells me that we still have a ways to go before male hormonal contraception can be operationalized,” she said in an interview.
Further findings from the study showed that there were six cases of sperm rebound during the efficacy phase, with sperm concentrations ranging from 2 million to 16.6 million/mL. Overall, the treatment showed a combined failure rate of 7.5%, which included nonsuppression by the end of the suppression phase, sperm rebound during the efficacy phase, or pregnancy during the efficacy phase.
By week 52 of the recovery phase, the cumulative rate of recovery of spermatogenesis to a concentration of at least 15 million/mL was 94.8 per 100 continuing users (J Clin Endocrinol Metab. 2016 Oct 27. doi: 10.1210/jc.2016-2141).
However, eight men had not recovered spermatogenesis enough to meet the criteria for return to fertility; five of these showed restored sperm counts by week 74 of the recovery phase, two declined further follow-up, and one did not recover within 4 years of the last injection.
Nearly half of all participants reported acne, and 38.1% reported an increase in libido. There were also 65 reports of emotional disorders, although the authors noted that 62 of these reports came from the same center in Indonesia. Other adverse events included injection site pain (23.1%), myalgia (16.3%), and gynecomastia (5.6%).
The trial was terminated early because of concerns over the regimen’s potential behavioral effects. The authors argued that similar effects have been observed in both the intervention and placebo arms of previous trials of this combination, which were designed to look only at suppression of spermatogenesis.
“Contraceptive efficacy studies cannot involve placebo groups for obvious ethical reasons,” they wrote. “Therefore, a definitive answer as to whether the potential risks of this hormonal combination for male contraception outweigh the potential benefits cannot be made based on the present results.”
The study was supported by the United Nations Development Programme/United Nations Population Fund/United Nations International Children’s Emergency Fund/World Health Organization/World Bank Special Programme of Research, Development, and Research Training in Human Reproduction (Human Reproduction Programme, World Health Organization, Geneva, Switzerland), and CONRAD (Eastern Virginia Medical School, Arlington, Va.), using funding from the Bill & Melinda Gates Foundation and the U.S. Agency for International Development. No relevant conflicts of interest were declared. Dr. Prager reported having no relevant financial conflicts of interest.
FROM JOURNAL OF CLINICAL ENDOCRINOLOGY & METABOLISM
Key clinical point:
Major finding: An injectable male contraceptive showed an overall failure rate of 7.5% but achieved sperm suppression to a concentration at or below 1 million/mL in 95.9% of those who received at least one injection.
Data source: Prospective, multicenter phase II trial of 320 healthy couples.
Disclosures: The study was supported by the United Nations Development Programme/United Nations Population Fund/United Nations International Children’s Emergency Fund/World Health Organization/World Bank Special Programme of Research, Development, and Research Training in Human Reproduction (Human Reproduction Programme, World Health Organization, Geneva, Switzerland), and CONRAD (Eastern Virginia Medical School, Arlington, Va.), using funding from the Bill & Melinda Gates Foundation and the U.S. Agency for International Development. No relevant conflicts of interest were declared.
Anti–nerve growth factor drug has long-term OA pain benefit, but unclear safety
Treatment for a year with the human anti–nerve growth factor monoclonal antibody fulranumab provided pain relief and functional benefits to patients with moderate to severe chronic osteoarthritis knee or hip pain but continued to show signs that the biologic may contribute to rapid progression of disease in a proportion of patients who came to need joint replacement, especially when used concurrently with nonsteroidal anti-inflammatory drugs.
The findings come from the double-blind extension phase of a 12-week, phase II, placebo-controlled, double-blind, randomized study that found significant reduction in the average pain intensity score (P less than or equal to .030) for patients who took fulranumab (Pain. 2013 Oct;154[10]:1910-9). The investigators wanted to determine the long-term safety of the biologic in light of the Food and Drug Administration’s 2010 “clinical hold” on studies of anti–nerve growth factor (anti-NGF) drugs such as fulranumab after concerns emerged that the class of anti-NGF antibodies may be associated with potential treatment-emergent adverse events (TEAEs) leading to joint destruction.
The long-term safety and efficacy results from the extension phase of the study, in which fulranumab was given as an adjunctive therapy to standard pain therapy, revealed higher rates of serious TEAEs, including joint replacement, associated with fulranumab. While most of those TEAEs were independently adjudicated to stem from normal progression of OA, one-fifth of the patients on fulranumab who needed joint replacement had rapid progression of OA (RPOA).
Investigators led by Panna Sanga, MD, director of clinical research at Janssen, sought to determine the effects of the anti-NGF biologic as an adjunctive therapy to standard pain therapy over a 92-week period in 401 patients who had completed the 12-week efficacy study, as well as over a 26-week posttreatment follow-up. In the current extension phase of the trial, patients continued on their randomized dose in the original trial of placebo or subcutaneous fulranumab 1 mg or 3 mg every 4 weeks or fulranumab 3 mg, 6 mg, or 10 mg every 8 weeks, but they were permitted to change their concurrent pain medications as clinically needed.
Although the study intended for patients to receive 2 years of treatment (104 weeks), the FDA’s clinical hold meant that the median duration of exposure to fulranumab in the extension study was only 365-393 days across the dosing regimens, the authors explained (Arthritis Rheumatol. 2016 Oct 16 doi: 10.1002/art.39943).
Overall, 421 (90%) of 466 intent-to-treat patients experienced at least one TEAE during both study phases, with similar incidence between those randomized to placebo (n = 69, 88%) and the fulranumab groups (n = 352, 91%).
TEAEs that occurred with a frequency of 10% or more among all fulranumab-treated patients were arthralgia (21%), OA (18%), paresthesia, and upper respiratory tract infection (13% each), while these were arthralgia (15%), OA (14%), and sinusitis (12%) among patients randomized to placebo.
A total of 109 patients (23%) reported serious TEAEs, including 13 (17%) taking placebo and 96 (25%) taking fulranumab. Serious TEAEs that were reported in 5% or more of patients in the fulranumab groups were knee arthroplasty (n = 38, 10%) and hip arthroplasty (n = 26, 7%). Overall, 81 joint replacements occurred in 71 patients (placebo: n = 8, 11%; fulranumab: n = 63, 89%).
An independent adjudication committee that the study sponsor, Janssen, established after the study was put on hold ruled that the majority of these joint replacements resulted from the normal progression of OA (n = 56, 79%). However, the adjudication committee determined that 15 (21%) patients in the fulranumab treatment groups had RPOA.
The study authors pointed out that the patients with RPOA regularly used NSAIDs and had a prior history of OA in the affected joint. Nevertheless, because of the small number of RPOA cases per treatment group, a drug or dose effect for RPOA could not be evaluated, they said.
“Future studies are warranted to demonstrate whether limiting the use of concomitant chronic NSAIDs and using only lower doses of fulranumab may reduce the risk of RPOA,” they wrote.
The extension study also looked at the longer-term efficacy of fulranumab. Efficacy endpoints on the Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC) pain and physical function subscales and the Patient Global Assessment (PGA) scores that looked at changes from baseline showed that in comparison with placebo, dosing regimens of fulranumab 3 mg every 4 weeks and 10 mg every 8 weeks gave “continued effective relief of the pain associated with knee and hip OA as early as week 4 and was maintained up to week 53,” the researchers reported.
These results were further corroborated by improvements in physical function on the WOMAC physical function subscale and PGA scale scores, which suggested “sustained efficacy of long-term fulranumab treatment,” they said.
All authors except one were employees of Janssen.
FROM ARTHRITIS & RHEUMATOLOGY
Key clinical point:
Major finding: A total of 81 joint replacements occurred in 71 patients (placebo: n = 8, 11%; fulranumab: n = 63, 89%).
Data source: Phase II randomized double-blind extension study of 401 patients with moderate to severe chronic OA knee or hip pain.
Disclosures: All authors except one were employees of Janssen, which funded the study.
Ezetimibe’s ACS benefit centers on high-risk, post-CABG patients
ROME – Patients who have undergone coronary artery bypass surgery and who later have an acute coronary syndrome event gain the most from an aggressive lipid-lowering regimen, according to an exploratory analysis of data from more than 18,000 patients enrolled in the IMPROVE-IT trial that tested the incremental benefit from ezetimibe treatment when added to a statin.
Additional exploratory analyses further showed that high-risk acute coronary syndrome (ACS) patients without a history of coronary artery bypass grafting (CABG) also benefited from adding ezetimibe to a background regimen of simvastatin, but the benefit from adding ezetimibe completely disappeared in low-risk ACS patients, Alon Eisen, MD, said at the annual congress of the European Society of Cardiology.
“The benefit of adding ezetimibe to a statin was enhanced in patients with prior CABG and in other high-risk patients with no prior CABG, supporting the use of more intensive lipid-lowering therapy in these high-risk patients,” said Dr. Eisen, a cardiologist at Brigham and Women’s Hospital in Boston. He also highlighted that ezetimibe is “a safe drug that is coming off patent.” Adding ezetimibe had a moderate effect on LDL cholesterol levels, cutting them from a median of 70 mg/dL in patients in the placebo arm to a median of 54 mg/dL in the group who received ezetimibe.
These results “show that if we pick the right patients, a very benign drug can have a great benefit,” said Eugene Braunwald, MD, a coinvestigator on the IMPROVE-IT trial and a collaborator with Dr. Eisen on the new analysis. The new findings “emphasize that the higher a patient’s risk, the more effect they get from cholesterol-lowering treatment,” said Dr. Braunwald, professor of medicine at Harvard University and a cardiologist at Brigham and Women’s Hospital, both in Boston.
The second exploratory analysis reported by Dr. Eisen looked at the more than 16,000 patients in IMPROVE-IT without a history of CABG. The analysis applied a newly developed, nine-item formula for stratifying atherothrombotic risk (Circulation. 2016 Jul 26;134[4]:304-13) to divide these patients into low-, intermediate-, and high-risk subgroups. Patients in the high-risk subgroup (20% of the IMPROVE-IT subgroup) had a 6–percentage point reduction in their primary endpoint event rate with added ezetimibe treatment, while those at intermediate risk (31%) got a 2–percentage point decrease in endpoint events, and low-risk patients (49%) actually showed a small, less than 1–percentage point increase in endpoint events with added ezetimibe, Dr. Eisen reported.
IMPROVE-IT was funded by Merck, the company that markets ezetimibe (Zetia). Dr. Eisen had no disclosures. Dr. Braunwald has been a consultant to Merck as well as to Bayer, Daiichi Sankyo, The Medicines Company, Novartis, and Sanofi.
On Twitter @mitchelzoler
I suspect that the patients in IMPROVE-IT with a history of coronary artery bypass graft surgery were more likely than the other enrolled acute coronary syndrome patients to have more extensive and systemic atherosclerotic disease. Although coronary artery bypass addresses the most acute obstructions to coronary flow that exist at the time of surgery, the procedure does not cure the patient’s underlying vascular disease. We know that a substantial majority of coronary events occur in arteries that are not heavily stenosed.
Another important limitation to keep in mind about the IMPROVE-IT trial was that the background statin treatment all patients received was modest – 40 mg of simvastatin daily. In real-world practice, high-risk patients should go on the most potent statin regimen they can tolerate – ideally, 40 mg daily of rosuvastatin. The need for additional lipid-lowering interventions, with ezetimibe or other drugs, can then be considered as an add-on to aggressive statin therapy.
Richard A. Chazal, MD, is an invasive cardiologist and medical director of the Heart and Vascular Institute of Lee Memorial Health System in Fort Myers, Fla. He is also the current president of the American College of Cardiology. He had no disclosures. He made these comments in an interview.
I suspect that the patients in IMPROVE-IT with a history of coronary artery bypass graft surgery were more likely than the other enrolled acute coronary syndrome patients to have more extensive and systemic atherosclerotic disease. Although coronary artery bypass addresses the most acute obstructions to coronary flow that exist at the time of surgery, the procedure does not cure the patient’s underlying vascular disease. We know that a substantial majority of coronary events occur in arteries that are not heavily stenosed.
Another important limitation to keep in mind about the IMPROVE-IT trial was that the background statin treatment all patients received was modest – 40 mg of simvastatin daily. In real-world practice, high-risk patients should go on the most potent statin regimen they can tolerate – ideally, 40 mg daily of rosuvastatin. The need for additional lipid-lowering interventions, with ezetimibe or other drugs, can then be considered as an add-on to aggressive statin therapy.
Richard A. Chazal, MD, is an invasive cardiologist and medical director of the Heart and Vascular Institute of Lee Memorial Health System in Fort Myers, Fla. He is also the current president of the American College of Cardiology. He had no disclosures. He made these comments in an interview.
I suspect that the patients in IMPROVE-IT with a history of coronary artery bypass graft surgery were more likely than the other enrolled acute coronary syndrome patients to have more extensive and systemic atherosclerotic disease. Although coronary artery bypass addresses the most acute obstructions to coronary flow that exist at the time of surgery, the procedure does not cure the patient’s underlying vascular disease. We know that a substantial majority of coronary events occur in arteries that are not heavily stenosed.
Another important limitation to keep in mind about the IMPROVE-IT trial was that the background statin treatment all patients received was modest – 40 mg of simvastatin daily. In real-world practice, high-risk patients should go on the most potent statin regimen they can tolerate – ideally, 40 mg daily of rosuvastatin. The need for additional lipid-lowering interventions, with ezetimibe or other drugs, can then be considered as an add-on to aggressive statin therapy.
Richard A. Chazal, MD, is an invasive cardiologist and medical director of the Heart and Vascular Institute of Lee Memorial Health System in Fort Myers, Fla. He is also the current president of the American College of Cardiology. He had no disclosures. He made these comments in an interview.
ROME – Patients who have undergone coronary artery bypass surgery and who later have an acute coronary syndrome event gain the most from an aggressive lipid-lowering regimen, according to an exploratory analysis of data from more than 18,000 patients enrolled in the IMPROVE-IT trial that tested the incremental benefit from ezetimibe treatment when added to a statin.
Additional exploratory analyses further showed that high-risk acute coronary syndrome (ACS) patients without a history of coronary artery bypass grafting (CABG) also benefited from adding ezetimibe to a background regimen of simvastatin, but the benefit from adding ezetimibe completely disappeared in low-risk ACS patients, Alon Eisen, MD, said at the annual congress of the European Society of Cardiology.
‘The benefit of adding ezetimibe to a statin was enhanced in patients with prior CABG and in other high-risk patients with no prior CABG, supporting the use of more intensive lipid-lowering therapy in these high-risk patients,” said Dr. Eisen, a cardiologist at Brigham and Women’s Hospital in Boston. He also highlighted that ezetimibe is “a safe drug that is coming off patent.” Adding ezetimibe had a moderate effect on LDL cholesterol levels, cutting them from a median of 70 mg/dL in patients in the placebo arm to a median of 54 mg/dL in the group who received ezetimibe.
These results “show that if we pick the right patients, a very benign drug can have a great benefit,” said Eugene Braunwald, MD, a coinvestigator on the IMPROVE-IT trial and a collaborator with Dr. Eisen on the new analysis. The new findings “emphasize that the higher a patient’s risk, the more effect they get from cholesterol-lowering treatment,” said Dr. Braunwald, professor of medicine at Harvard University and a cardiologist at Brigham and Women’s Hospital, both in Boston.
The second exploratory analysis reported by Dr. Eisen looked at the more than 16,000 patients in IMPROVE-IT without history of CABG. The analysis applied a newly developed, nine-item formula for stratifying atherothrombotic risk (Circulation. 2016 July 26;134[4];304-13) to divide these patients into low-, intermediate- and high-risk subgroups. Patients in the high-risk subgroup (20% of the IMPROVE-IT subgroup) had a 6–percentage point reduction in their primary endpoint event rate with added ezetimibe treatment, while those at intermediate risk (31%) got a 2–percentage point decrease in endpoint events, and low-risk patients (49%) actually showed a small, less than 1–percentage point increase in endpoint events with added ezetimibe, Dr. Eisen reported.
IMPROVE-IT was funded by MERCK, the company that markets ezetimibe (Zetia). Dr. Eisen had no disclosures. Dr. Braunwald has been a consultant to Merck as well as to Bayer, Daiichi Sankyo, The Medicines Company, Novartis, and Sanofi.
[email protected]
On Twitter @mitchelzoler
ROME – Patients who have undergone coronary artery bypass surgery and who later have an acute coronary syndrome event gain the most from an aggressive lipid-lowering regimen, according to an exploratory analysis of data from more than 18,000 patients enrolled in the IMPROVE-IT trial that tested the incremental benefit from ezetimibe treatment when added to a statin.
Additional exploratory analyses further showed that high-risk acute coronary syndrome (ACS) patients without a history of coronary artery bypass grafting (CABG) also benefited from adding ezetimibe to a background regimen of simvastatin, but the benefit from adding ezetimibe completely disappeared in low-risk ACS patients, Alon Eisen, MD, said at the annual congress of the European Society of Cardiology.
‘The benefit of adding ezetimibe to a statin was enhanced in patients with prior CABG and in other high-risk patients with no prior CABG, supporting the use of more intensive lipid-lowering therapy in these high-risk patients,” said Dr. Eisen, a cardiologist at Brigham and Women’s Hospital in Boston. He also highlighted that ezetimibe is “a safe drug that is coming off patent.” Adding ezetimibe had a moderate effect on LDL cholesterol levels, cutting them from a median of 70 mg/dL in patients in the placebo arm to a median of 54 mg/dL in the group who received ezetimibe.
These results “show that if we pick the right patients, a very benign drug can have a great benefit,” said Eugene Braunwald, MD, a coinvestigator on the IMPROVE-IT trial and a collaborator with Dr. Eisen on the new analysis. The new findings “emphasize that the higher a patient’s risk, the more effect they get from cholesterol-lowering treatment,” said Dr. Braunwald, professor of medicine at Harvard University and a cardiologist at Brigham and Women’s Hospital, both in Boston.
The second exploratory analysis reported by Dr. Eisen looked at the more than 16,000 patients in IMPROVE-IT without a history of CABG. The analysis applied a newly developed, nine-item formula for stratifying atherothrombotic risk (Circulation. 2016 July 26;134[4]:304-13) to divide these patients into low-, intermediate-, and high-risk subgroups. Patients in the high-risk subgroup (20% of the IMPROVE-IT subgroup) had a 6–percentage point reduction in their primary endpoint event rate with added ezetimibe treatment, while those at intermediate risk (31%) got a 2–percentage point decrease in endpoint events, and low-risk patients (49%) actually showed a small, less than 1–percentage point increase in endpoint events with added ezetimibe, Dr. Eisen reported.
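The percentage-point reductions reported above translate directly into the familiar number-needed-to-treat (NNT) metric. The sketch below is an illustrative back-of-the-envelope calculation using the standard textbook formula, not an analysis performed by the trial investigators:

```python
# Illustrative only: number needed to treat (NNT) derived from the
# absolute risk reductions quoted in the exploratory analysis.
# NNT = 1 / absolute risk reduction (expressed as a fraction).

def nnt(absolute_risk_reduction_pct: float) -> float:
    """Patients needed to treat to prevent one primary endpoint event."""
    return 1.0 / (absolute_risk_reduction_pct / 100.0)

# High-risk subgroup: 6-percentage-point reduction
print(round(nnt(6.0)))  # 17 patients per event prevented

# Intermediate-risk subgroup: 2-percentage-point reduction
print(round(nnt(2.0)))  # 50 patients per event prevented
```

The widening NNT as baseline risk falls illustrates Dr. Braunwald's point that the higher a patient's risk, the more effect they get from cholesterol-lowering treatment.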
IMPROVE-IT was funded by Merck, the company that markets ezetimibe (Zetia). Dr. Eisen had no disclosures. Dr. Braunwald has been a consultant to Merck as well as to Bayer, Daiichi Sankyo, The Medicines Company, Novartis, and Sanofi.
[email protected]
On Twitter @mitchelzoler
AT THE ESC CONGRESS 2016
Key clinical point: The benefit of adding ezetimibe to a statin after an acute coronary syndrome event is concentrated in high-risk patients, such as those with prior CABG.
Major finding: The absolute primary-event risk reduction was 9% in post-CABG patients and 1% in all other patients.
Data source: An exploratory, post-hoc analysis of data collected in IMPROVE-IT, a multicenter trial with 18,144 patients.
Disclosures: IMPROVE-IT was funded by Merck, the company that markets ezetimibe (Zetia). Dr. Eisen had no disclosures. Dr. Braunwald has been a consultant to Merck as well as to Bayer, Daiichi Sankyo, The Medicines Company, Novartis, and Sanofi.
WHO’s push for addressing mental health
The following opinions are my own and not those of the U.S. Department of Defense.
We often lament the shortage of psychiatrists in the United States, but the problem is dire across the globe. According to the World Health Organization, 45% of the population of low- and middle-income countries lives in areas with fewer than one psychiatrist per 100,000 people.
For many countries, this translates into increased costs for medical care, loss of employee productivity, higher percentages of the population living in poverty, impacts to national security, and an inability to respond to disasters or war. Therefore, significant expansions of mental health approaches need to take place to extend the scope of care beyond traditional therapeutic modalities.
Another meaningful theme was the importance of cultural competence when it comes to patient care. Access to well-trained and supervised clinical and nonclinical providers who are culturally sensitive builds capacity to treat mental, neurologic, and substance use disorders in communities worldwide. Those goals also should include comprehensive care and effective case management, resilience and prevention awareness activities, self-efficacy skills building, technological and innovative capabilities, research and surveillance support, and human rights protections.
Representatives from more than 30 member countries attended the forum, as did United Nations agencies and intergovernmental organizations. Several noteworthy innovative practices emerged. Impressive were projects in Canada that involved all of its representatives in a suicide awareness campaign instituted through a toolkit, increased epilepsy care in remote areas of Ghana, Malta’s law on autism as a policy model, low-intensity interventions in the United Kingdom, and “task-shifting” to nonmedical peer counselors supporting new mothers in India.
Other new guides that complement the mhGAP were introduced, such as a manual for group interpersonal psychotherapy for depression that was piloted in 30 villages in Uganda (JAMA. 2003;289[23]:3117-24) and a Problem Management Plus guide for responding to trauma survivors, which was punctuated by a presentation on Psychological First Aid. Other tools being developed include a prototype of a mobile application for the mhGAP Intervention guide and a suicide prevention toolkit in preparation for World Health Day on April 7, 2017, when “Depression: Let’s talk” will be the yearlong campaign theme. This will be the first time in 15 years that a mental health condition will be highlighted. This worldwide attention should help raise awareness about the nature of depression, reduce stigma, and encourage help-seeking behavior. It also should act as a call to clinicians who routinely treat patients with major depressive disorder.
I would encourage psychiatrists and primary care physicians to get involved in the “Depression: Let’s talk” campaign and to start making plans for events throughout 2017. For more information on WHO and its mental health efforts, visit online.
Ms. Garrick is a special assistant, manpower and reserve affairs for the U.S. Department of Defense. Previously, she served as the director of the Defense Suicide Prevention Office. She has been a leader in veterans’ disability policy and suicide prevention and peer support programs; worked with Gulf War veterans; and provided individual, group, and family therapy to Vietnam veterans and their families dealing with posttraumatic stress disorder.
Drug prices, not the health law, top voters’ health priorities for 2017
Until this week, when big increases in insurance premiums were unveiled for next year, the federal health law has not been a major issue in the presidential election. In fact, fixing what ails the Affordable Care Act isn’t even among voters’ top priorities for health issues for next year, according to a new poll.
The monthly October tracking poll from the Kaiser Family Foundation finds that, when voters are asked about what the next president and Congress should do about health care, issues relating to prescription drug prices and out-of-pocket spending far outrank proposals to address the shortcomings of the health law. (Kaiser Health News is an editorially independent project of the foundation.)
By contrast, fewer than a third of all voters favored proposals to repeal requirements in the health law for employers to provide health insurance to their workers or pay a fine; reduce the tax subsidies that help people pay their insurance premiums; and eliminate a tax on high-cost health plans.
Republicans (but not Democrats or independents) still overwhelmingly want to repeal the entire health law, with 60% supporting that action. But Republicans are fractured on why they don’t like the law. Asked what their main reason is for their disapproval, nearly a third (31%) said the law “gives government too big a role in the health care system,” while 27% said “the law is just one of many indications that President Obama took the country in the wrong direction.”
The poll also asked voters about adding a government-sponsored “public option” to health plans available to those purchasing insurance in the health law’s marketplaces. Both President Barack Obama and Democratic nominee Hillary Clinton have called for reconsideration of the idea, which narrowly failed to be included in the original law in 2010.
As with the health law itself, semantics matter in this debate over whether to include a government plan to compete with private plans. Even in describing the same concept, a much larger majority (70%) favored the idea of “creating a public health insurance option to compete with private health insurance plans” than favored “creating a government-administered public health insurance option to compete with private health insurance plans” (53%).
Opinions about a public option are also relatively easily swayed when voters are presented with arguments for and against the idea. For example, 21% of supporters shifted to opposition when told that doctors and hospitals might be paid less under a public option, while 13% shifted from opposition to support when told that having a public plan compete with private plans might help drive down costs.
The survey was conducted between Oct. 12 and Oct. 18 among 1,205 adults, using both land lines and cell phones. The margin of error was plus or minus 3 percentage points.
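The reported margin of error is consistent with the survey's sample size. As a rough check (pollsters' exact methods typically also adjust for weighting and design effects), the maximum 95% margin of error for a simple random sample of n respondents is:

```python
import math

def margin_of_error(n: int, z: float = 1.96) -> float:
    """Worst-case 95% margin of error for a simple random sample.

    Uses p = 0.5, which maximizes p * (1 - p); returns a fraction.
    """
    return z * math.sqrt(0.25 / n)

moe = margin_of_error(1205)
print(f"{moe * 100:.1f} percentage points")  # 2.8, consistent with the reported +/-3
```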
This story was produced by Kaiser Health News, a national health policy news service that is part of the nonpartisan Henry J. Kaiser Family Foundation.
Lower mortality with left-sided colon tumors
Left-sided primary colon tumors are associated with significantly lower mortality than right-sided tumors, according to a systematic review and meta-analysis published online Oct. 27 in JAMA Oncology.
Previous research has suggested that left- and right-sided colon cancer have different biological, molecular, and clinical features, wrote Fausto Petrelli, MD, from the oncology department at ASST Bergamo Ovest (Italy) and his coauthors.
A meta-analysis of 66 studies involving 1,437,846 patients, in which the cancer side was reported and survival data were available, showed patients with a left-sided primary tumor had a significant 18% lower risk of death (P less than .001), compared with those with right-sided tumors. This relationship was independent of other factors including stage, race, adjuvant chemotherapy, and the quality of the study, although studies that only included patients with stage IV disease showed an even greater 27% reduction in death in those with left-sided tumors.
“An increasingly large amount of evidence is accumulating showing that colon tumors proximal and distal to the splenic flexure are distinct clinical and biological entities,” the authors wrote. “Apart from having a different embryological origin – proximal colon from midgut and distal colon and rectum from hindgut – the right colon displays peculiar differences in mucosal immunology, probably owing to differences in gut microbiota.”
For example, the proximal colon has been found to have a higher concentration of eosinophils and intraepithelial T cells, which may be due to the fact that immune cells in the distal colorectum have to walk an even finer line between immunogenicity against pathogens and tolerance for normal gut microbiota.
“This observation could also explain the differences in immunological response to tumors developing in the proximal colon characterized by an increased immune activity and, in turn, reflect the specific differences in pathogenesis and outcome.”
Given their findings, the authors argued that side of origin should be included as a prognostic factor for both early and advanced disease, and should also be considered in treatment decision making and in future clinical trials.
No conflicts of interest were declared.
FROM JAMA ONCOLOGY
Key clinical point: Left-sided colon tumors are associated with significantly lower mortality than right-sided tumors.
Major finding: Patients with a left-sided primary tumor had a significant 18% lower risk of death, compared with those with a right-side tumor.
Data source: Systematic review and meta-analysis of 66 studies involving 1,437,846 patients.
Disclosures: No relevant conflicts of interest were declared.
Pediatric OSA improved with oral montelukast
The majority of children with obstructive sleep apnea (OSA) who took oral montelukast showed reductions in their apnea-hypopnea index (AHI) scores, in a randomized, double-blind placebo-controlled study.
Typically, OSA in children is treated by adenotonsillectomy, according to Leila Kheirandish-Gozal, MD, director of clinical sleep research at the University of Chicago, and her colleagues. Prior to this study, only one randomized controlled trial had showed that children with mild OSA “responded favorably” to the leukotriene modifier montelukast (Pediatrics. 2012 Aug 31. doi: 10.1542/peds.2012-0310).
Twenty (71%) of the children who received montelukast had fewer AHI events per hour of total sleep time at the end of the study. The average number of such events for these patients was 4.2 plus or minus 2.8 after taking the drug, compared with 9.2 plus or minus 4.1 at the beginning of the study (P less than .0001). Only two (6.9%) of the patients who took the placebo had lower AHI scores at the end of the study, with the average AHI score for the placebo group having been 8.7 plus or minus 4.9 events per hour of total sleep time, compared with 8.2 plus or minus 5.0 events per hour at baseline.
Another improvement seen by patients who received the drug was a decrease in the number of 3% reductions in arterial oxygen saturation per hour of sleep. At the beginning of the study, these patients had 7.2 plus or minus 3.1 of these events; by the end of the study, the number of these events was down to 2.8 plus or minus 1.8 (P less than .001). No significant decrease in the number of these events was seen among patients in the placebo group.
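The size of the AHI improvement in the montelukast group can be put on a standardized scale from the reported summary statistics. The sketch below computes Cohen's d from the group means and standard deviations; this is an illustration only, not an analysis from the paper, and a paired design would properly use the standard deviation of within-patient change, which the reported summaries do not provide:

```python
import math

def cohens_d(mean1: float, sd1: float, mean2: float, sd2: float) -> float:
    """Standardized mean difference using the pooled SD of the two groups."""
    pooled_sd = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (mean1 - mean2) / pooled_sd

# Montelukast group: baseline 9.2 +/- 4.1 vs. end of study 4.2 +/- 2.8
# AHI events per hour of total sleep time, as reported above.
d = cohens_d(9.2, 4.1, 4.2, 2.8)
print(round(d, 2))  # 1.42, a large effect by conventional benchmarks
```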
In this study, “montelukast emerges as favorably reducing the severity of OSA short term in children 2-10 years of age. These findings add to the existing evidence supporting a therapeutic role for anti-inflammatory approaches in the management of this highly prevalent condition in children, and clearly justify future studies targeting the long-term benefits of these approaches in children with OSA,” the researchers wrote.
All patients participated in overnight sleep studies following a referral to one of two sleep clinics by their primary care pediatrician or pediatric otolaryngologist, at the beginning of the study. Children who had been diagnosed with symptomatic snoring and had an AHI score of greater than 2 events per hour of total sleep time, and for whom adenotonsillectomy was contemplated, were included in the study.
Central, obstructive, and mixed apneic events were counted and hypopneas were assessed. Obstructive apnea was defined “as the absence of airflow with continued chest wall and abdominal movement for a duration of at least two breaths,” the investigators said. Hypopneas were defined “as a decrease in oronasal flow greater than 50% on either the thermistor or nasal pressure transducer signal, with a corresponding decrease in arterial oxygen saturation greater than 3% or arousal,” Dr. Kheirandish-Gozal and her coauthors said.
Patients were excluded from the study for a variety of reasons, including having severe OSA requiring early surgical intervention.
Adverse events included headache in two children, one from the experimental group and one from the placebo group, and nausea in two subjects from the placebo group and in one from the montelukast group.
Merck provided tablets used in this study. Dr. Kheirandish-Gozal reported grants from Merck and the National Institutes of Health during the conduct of the study. David Gozal, MD, is supported by the Herbert T. Abelson Chair in Pediatrics at the University of Chicago.
The majority of children with obstructive sleep apnea (OSA) who took oral montelukast showed reductions in their apnea-hypopnea index (AHI) scores in a randomized, double-blind, placebo-controlled study.
Typically, OSA in children is treated by adenotonsillectomy, according to Leila Kheirandish-Gozal, MD, director of clinical sleep research at the University of Chicago, and her colleagues. Prior to this study, only one randomized controlled trial had shown that children with mild OSA “responded favorably” to the leukotriene modifier montelukast (Pediatrics. 2012 Aug 31. doi: 10.1542/peds.2012-0310).
Twenty (71%) of the children who received montelukast had fewer AHI events per hour of total sleep time at the end of the study. For these patients, the average fell from 9.2 ± 4.1 events at baseline to 4.2 ± 2.8 after treatment (P < .0001). Only two (6.9%) of the patients who took placebo had lower AHI scores at the end of the study; the placebo group averaged 8.2 ± 5.0 events per hour of total sleep time at baseline and 8.7 ± 4.9 at the end of the study.
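The AHI figures above are simple rates: respiratory events divided by hours of total sleep time. A minimal sketch of the arithmetic, using hypothetical event counts and sleep durations (not data from this trial):

```python
# Sketch of how an apnea-hypopnea index (AHI) is computed: total
# respiratory events divided by hours of total sleep time. The event
# counts and sleep duration below are hypothetical.

def ahi(apneas: int, hypopneas: int, sleep_hours: float) -> float:
    """Return apnea-hypopnea events per hour of total sleep time."""
    return (apneas + hypopneas) / sleep_hours

baseline = ahi(40, 32, 8.0)   # 72 events / 8 h = 9.0 events per hour
followup = ahi(18, 15, 8.0)   # 33 events / 8 h = 4.125 events per hour
reduction = (baseline - followup) / baseline
print(f"baseline={baseline:.1f}, follow-up={followup:.1f}, drop={reduction:.0%}")
```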
Patients who received the drug also showed a decrease in 3% arterial oxygen desaturation events per hour of sleep, from 7.2 ± 3.1 events at baseline to 2.8 ± 1.8 at the end of the study (P < .001). No significant decrease in these events was seen in the placebo group.
In this study, “montelukast emerges as favorably reducing the severity of OSA short term in children 2-10 years of age. These findings add to the existing evidence supporting a therapeutic role for anti-inflammatory approaches in the management of this highly prevalent condition in children, and clearly justify future studies targeting the long-term benefits of these approaches in children with OSA,” the researchers wrote.
At the beginning of the study, all patients underwent overnight sleep studies after referral to one of two sleep clinics by their primary care pediatrician or pediatric otolaryngologist. Children diagnosed with symptomatic snoring who had an AHI of greater than 2 events per hour of total sleep time, and for whom adenotonsillectomy was contemplated, were included in the study.
Central, obstructive, and mixed apneic events were counted, and hypopneas were assessed. OSA was defined “as the absence of airflow with continued chest wall and abdominal movement for a duration of at least two breaths,” the investigators said. Hypopneas were defined “as a decrease in oronasal flow greater than 50% on either the thermistor or nasal pressure transducer signal, with a corresponding decrease in arterial oxygen saturation greater than 3% or arousal,” Dr. Kheirandish-Gozal and her coauthors said.
Patients were excluded from the study for a variety of reasons, including having severe OSA requiring early surgical intervention.
Adverse events included headache in two children, one from the experimental group and one from the placebo group, and nausea in two subjects from the placebo group and in one from the montelukast group.
Merck provided tablets used in this study. Dr. Kheirandish-Gozal reported grants from Merck and the National Institutes of Health during the conduct of the study. David Gozal, MD, is supported by the Herbert T. Abelson Chair in Pediatrics at the University of Chicago.
Key clinical point:
Major finding: 71% of patients who took montelukast had a significant reduction in AHI events per hour of total sleep time (P < .0001).
Data source: A prospective, randomized, double-blind placebo-controlled study of 57 children with obstructive sleep apnea.
Disclosures: Merck provided tablets used in this study. Dr. Kheirandish-Gozal reported grants from Merck and the National Institutes of Health during the conduct of the study. David Gozal, MD, is supported by the Herbert T. Abelson Chair in Pediatrics at the University of Chicago.
Genetic screening for CLL premature, speaker says
Photo courtesy of the National Institute of General Medical Sciences
NEW YORK—Research has shown that family history is a strong risk factor for developing chronic lymphocytic leukemia (CLL).
First-degree relatives have an 8.5-fold increased risk of developing CLL and an increased risk of other lymphoproliferative disorders, according to a study published in 2009.
However, despite the strong evidence of a genetic contribution, one expert believes it’s premature to bring genetic testing into the clinic for screening in CLL.
“At this time, we do not recommend genetic screening,” said Susan Slager, PhD, of the Mayo Clinic in Rochester, Minnesota.
“There’s no known relationship between the inherited variants and treatment response,” she explained, and the relatively low incidence of CLL argues against active screening in affected families at present.
Dr Slager discussed genetic and non-genetic factors associated with CLL and the clinical implications of these factors at Lymphoma & Myeloma 2016.
Demographic risk factors
Dr Slager noted that age, gender, and race are risk factors for CLL.
Individuals aged 65 to 74 have the highest incidence of CLL, accounting for 28% of new cases, while the risk is almost non-existent for those under age 20, she said.
There is a higher incidence of CLL in males than in females, and the reason for this gender disparity is unknown.
There is a higher incidence of CLL in Caucasians than Asians, for both males and females.
“Again, it’s unknown why there’s this variability in incidence in CLL,” Dr Slager said. “Obviously, age, sex, and race—these are things you can’t modify. You’re stuck with them.”
However, several studies have been undertaken to look at some of the potentially modifiable factors associated with CLL.
Beyond demographic factors
The International Lymphoma Epidemiology Consortium, known as InterLymph, was initiated in 2001 to evaluate risk factors associated with CLL. Study centers are located primarily in North America and Europe, with one in Australia.
In one of the larger InterLymph studies, investigators evaluated risk factors—lifestyle exposure, reproductive history, medical history, occupational exposures, farming exposure, and family history—in 2440 CLL patients and 15,186 controls.
The investigators found that sun exposure and atopy (allergies, asthma, eczema, and hay fever) appeared protective against CLL, while serologically confirmed hepatitis C virus (HCV) infection, farming exposure, and family history carried an increased risk of CLL.
This confirmed an earlier study from New South Wales, Australia, which found an inverse association between sun exposure and non-Hodgkin lymphoma (NHL): risk fell significantly with increasing recreational sun exposure.
Medical history
Another earlier study from New South Wales revealed a 20% reduction in the risk of NHL for any specific allergy.
However, the investigators of the large, more recent study observed little to no evidence of reduced risk for asthma and eczema.
The underlying biology for atopy or allergies is a hyper-immune system, Dr Slager explained.
“So if you have a hyper-immune system, then we hypothesize that you have protection against CLL,” she said.
Another medical exposure found to affect CLL risk is HCV. People infected with HCV have an increased risk of CLL, perhaps due to chronic antigen stimulation or disruption of T-cell function.
Height is also associated with CLL: risk increases with greater height. One idea is that taller individuals have greater exposure to growth hormones, which may promote cell proliferation. Another hypothesis behind the height association is that people of shorter stature experience more infections, which could build a stronger immune system, and a stronger immune system perhaps protects against NHL.
Occupational exposures
Investigators consistently observed a 20% increased risk of CLL for people living or working on a farm.
Animal farmers, as opposed to crop farmers, experienced some protection. However, the sample size was too small to be conclusive, with only 29 people across all studies being animal farmers.
Among other occupations evaluated, hairdressers also had an increased risk of CLL, although this too was based on a small sample size.
Family history
One of the strongest risk factors for CLL is family history.
Using population-based registry data from Sweden, investigators found that people with a first-degree relative with CLL have an 8.5-fold risk of CLL.
They also have an elevated risk of other lymphoproliferative disorders, including NHL (1.9-fold risk), Waldenström’s macroglobulinemia (4.0-fold risk), hairy cell leukemia (3.3-fold risk), and follicular lymphoma (1.6-fold risk).
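The fold risks cited here are relative risks read off registry data: incidence among first-degree relatives divided by incidence in the general population. A minimal sketch, with invented counts chosen only to reproduce the 8.5-fold figure (these are not the Swedish registry numbers):

```python
# Sketch of a relative ("fold") risk from registry-style counts:
# incidence among first-degree relatives of CLL patients divided by
# incidence in the general population. All counts are invented for
# illustration; they are not the Swedish registry data.

def relative_risk(exposed_cases: int, exposed_n: int,
                  unexposed_cases: int, unexposed_n: int) -> float:
    """Incidence ratio between an exposed and an unexposed group."""
    return (exposed_cases / exposed_n) / (unexposed_cases / unexposed_n)

# e.g. 17 cases per 10,000 relatives vs 2 per 10,000 in the population:
print(relative_risk(17, 10_000, 2, 10_000))  # 8.5
```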
GWAS in CLL
Investigators conducted genome-wide association studies (GWAS) to determine what is driving the familial risk.
Dr Slager described these studies as an agnostic approach that looks across the entire genome to determine which regions are associated with a trait of interest.
Typically, many markers are genotyped—somewhere between half a million and 5 million—and each is examined individually with respect to CLL, she said.
Unrelated cases and controls are included in the studies.
The first GWAS study identifying susceptibility loci for CLL was published in 2008. Subsequently, more studies were published with increasing sample sizes—more cases, more controls, and more genetic variants identified.
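The per-marker testing Dr Slager describes can be sketched as a chi-square test on a 2x2 table of allele counts in cases versus controls, repeated for every SNP. The counts below are invented for illustration:

```python
# Sketch of the per-marker association test in a case-control GWAS:
# each SNP is tested on a 2x2 table of risk-allele vs other-allele
# counts in cases and controls. Counts are invented for illustration.

def allelic_chi2(case_risk, case_other, ctrl_risk, ctrl_other):
    """1-df chi-square statistic for a 2x2 allele-count table."""
    table = [[case_risk, case_other], [ctrl_risk, ctrl_other]]
    total = sum(map(sum, table))
    rows = [sum(r) for r in table]
    cols = [case_risk + ctrl_risk, case_other + ctrl_other]
    return sum((table[i][j] - rows[i] * cols[j] / total) ** 2
               / (rows[i] * cols[j] / total)
               for i in range(2) for j in range(2))

def odds_ratio(case_risk, case_other, ctrl_risk, ctrl_other):
    """Allelic odds ratio (cross-product of the 2x2 table)."""
    return (case_risk * ctrl_other) / (case_other * ctrl_risk)

# Risk allele on 30% of case chromosomes vs 20% of control chromosomes:
print(allelic_chi2(300, 700, 200, 800))  # large statistic -> small p-value
print(odds_ratio(300, 700, 200, 800))    # ~1.7
```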
In the largest meta-analysis for CLL to date (Slager and Houlston et al, not yet published), investigators analyzed 4400 CLL cases and 13,000 controls.
They identified 9 more inherited variants associated with CLL, for a total of 43 identified to date.
The genes involved lie in the apoptosis, telomere-length, and B-cell lymphocyte development pathways.
“We have to remember, though, that these are non-causal,” Dr Slager cautioned. “We are just identifying the region in the genome that’s associated with CLL . . . . So now we have to dig deeper in these relationships to understand what’s going on.”
Using the identified CLL single-nucleotide polymorphisms (SNPs), the investigators computed a polygenic risk score. Individuals in the highest quintile of the score had a 2.7-fold increased risk of CLL.
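A polygenic risk score of this kind is typically a weighted sum of risk-allele counts, with each SNP weighted by the log of its per-allele odds ratio from the GWAS. The SNPs, odds ratios, and genotypes below are hypothetical:

```python
# Sketch of a polygenic risk score: a weighted sum of risk-allele counts
# (0, 1, or 2 per SNP), with weights taken as log odds ratios from a
# GWAS. The three SNPs, odds ratios, and genotypes are hypothetical.
import math

def polygenic_risk_score(genotypes, odds_ratios):
    """Sum of risk-allele counts weighted by log odds ratios."""
    return sum(g * math.log(orr) for g, orr in zip(genotypes, odds_ratios))

ors = [1.4, 1.2, 1.1]                        # per-allele odds ratios
low  = polygenic_risk_score([0, 0, 1], ors)  # few risk alleles
high = polygenic_risk_score([2, 2, 1], ors)  # many risk alleles
print(low, high)  # the higher score would fall in a higher quintile
```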
However, the most common GWAS variants explain only 17% of the genetic heritability of CLL, which suggests that more loci are yet to be identified, Dr Slager clarified.
She went on to say that CLL incidence varies by ethnicity. Caucasians have a very high rate of CLL, while Asians have a very low rate. And African Americans have an incidence rate between those of Caucasians and Asians.
Investigators have hypothesized that the differences in incidence are based on the distinct genetic variants that are associated with the ethnicities.
For example, 4 of the variants with more than 20% frequency in Caucasians are quite rare in Chinese individuals and are also quite uncommon in African Americans, with frequencies less than 10%.
Dr Slager suggested that conducting these kinds of studies in Asians and African Americans will take a large sample size and most likely require an international consortium to bring enough CLL cases together.
Impact on clinical practice
Because of the strong genetic risk, patients with CLL naturally want to know about their offspring and their siblings, Dr Slager has found.
Patients who have monoclonal B-cell lymphocytosis (MBL), which is a precursor to CLL, pose the biggest quandary.
MBL is detected in about 5% of people over age 40. However, it’s detected in about 15% to 18% of people with a first-degree relative with CLL.
“These are individuals who have lymphocytosis,” Dr Slager said. “They come to your clinic and have an elevated blood cell count, flow cytometry. [So] you screen them for MBL, and these individuals who have more than 500 cells per microliter, they are the ones who progress to CLL, at 1% per year.”
Some individuals without elevated blood counts nonetheless harbor the clonal cells, Dr Slager noted.
“They just don’t have the expansion,” she said. “The progression of these individuals to CLL is still yet to be determined.”
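The quoted 1%-per-year progression rate compounds into a cumulative risk over time. A minimal sketch, assuming a constant annual hazard for illustration (an assumption made here, not a claim from the talk):

```python
# The ~1%-per-year progression from high-count MBL to CLL compounds
# into a cumulative risk. A constant annual hazard is an assumption
# made here for illustration, not a claim from the presentation.

def cumulative_risk(annual_rate: float, years: int) -> float:
    """Probability of progressing at least once within `years` years."""
    return 1.0 - (1.0 - annual_rate) ** years

for n in (5, 10, 20):
    print(f"{n:>2} years: {cumulative_risk(0.01, n):.1%}")
```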
For these reasons, Dr Slager believes “it’s still premature to bring genetic testing into clinical practice.”
Future directions include bringing together environmental and inherited factors to create a model that will accurately predict the risk of CLL.