Preventing weight gain after smoking cessation
About three-quarters of current cigarette smokers want to quit, 40% will attempt to quit annually, and 90% of self-initiated attempts will be unsuccessful. Mean weight gain after smoking cessation may be as much as 13 pounds at 1 year, and 21 pounds over 5 years.
Population data suggest that more than one-half of women and one-third of men with a previous attempt to quit smoking report that weight gain was one of the primary reasons for relapse back to smoking.
Lorcaserin is a 5-HT2c (serotonin) receptor agonist FDA-approved for weight loss. Varenicline is the most effective monotherapy for smoking cessation and targets the alpha-4 beta-2 nicotinic acetylcholine receptor.
Ryan Hurt, MD, and his colleagues recently completed a pilot clinical trial evaluating the potential efficacy of combining varenicline and lorcaserin for the prevention of postcessation weight gain (PCWG) in obese and overweight smokers (Nicotine Tob Res. 2016 Nov 16. doi: 10.1093/ntr/ntw304).
In this study, 20 smokers with a body mass index of 27-40 kg/m2 received varenicline and lorcaserin for 12 weeks.
Fifty percent of subjects were abstinent from smoking at 12 weeks, among whom weight gain was only +1.1 ± 3.9 kg (90% confidence interval, –0.9 to +3.1 kg). The most common side effect of the combination was sleep disturbance, reported by five patients.
The study was limited by the small sample size and the absence of a control group or placebo.
As clinicians, we frequently employ combination therapy in chronic diseases such as diabetes and hypertension when single-agent therapy is ineffective. By combining drugs with different therapeutic targets, we can achieve our treatment goals.
In tobacco dependence treatment, we use combination pharmacotherapy for heavier smokers or for those who have tried and failed to quit previously. Interestingly, lorcaserin has been demonstrated in another pilot study to increase smoking cessation rates by itself.
The combination of lorcaserin and varenicline holds promise for the treatment of tobacco dependence by attacking the addiction through two different mechanisms and preventing PCWG, which may in turn prevent relapse back to smoking.
Dr. Ebbert is professor of medicine, a general internist at the Mayo Clinic in Rochester, Minn., and a diplomate of the American Board of Addiction Medicine. The opinions expressed are those of the author and do not necessarily represent the views and opinions of the Mayo Clinic. The opinions expressed in this article should not be used to diagnose or treat any medical condition, nor should they be used as a substitute for medical advice from a qualified, board-certified practicing clinician. Dr. Ebbert has no relevant financial disclosures about this article.
Second transplant, consolidation don’t add benefit in upfront multiple myeloma therapy
SAN DIEGO – It took a clinical trial with a byzantine design to prove it, but neither posttransplant consolidation therapy nor second transplant offered any additional survival benefits to patients with multiple myeloma, including patients with high-risk disease who were treated with an upfront thalidomide analogue and a proteasome inhibitor, followed by stem cell transplant and lenalidomide maintenance.
Among 758 patients with multiple myeloma who underwent standard induction therapy, followed by melphalan conditioning and autologous stem cell transplant (ASCT), there were no differences in either progression-free survival (PFS) or overall survival (OS) among patients assigned to follow-on therapy with lenalidomide (Revlimid) maintenance alone; consolidation therapy with four cycles of lenalidomide, bortezomib (Velcade), and dexamethasone (RVD), followed by lenalidomide maintenance; or a second transplant, followed by lenalidomide maintenance, reported Edward A. Stadtmauer, MD, coleader of the hematologic malignancies program at the Abramson Cancer Center and chief of the section of hematologic malignancies, University of Pennsylvania, Philadelphia.
Investigators in the STAMINA (Stem Cell Transplant With Lenalidomide Maintenance in Patients With Multiple Myeloma) trial (BMT CTN 0702) hypothesized that thalidomide analogues and proteasome inhibitors used in first-line therapy, consolidation, and long-term maintenance after high-dose melphalan and ASCT would improve survival, compared with a second ASCT.
To test this idea, they enrolled 758 patients and randomized them to one of the three aforementioned posttransplant strategies prior to transplant conditioning with high-dose melphalan (200 mg/m2) and ASCT.
Roughly 25% of patients in each treatment arm had high-risk disease, defined as beta-2 microglobulin levels greater than 5.5 mg/L, high-risk cytogenetics, or deletion 13 detected by standard cytogenetics only. The remaining patients in each arm had standard-risk disease.
Slightly more than half of patients received RVD upfront; about 13% received cyclophosphamide, bortezomib, and dexamethasone (CyBorD); roughly 10% received lenalidomide/dexamethasone; 12% were treated with bortezomib/dexamethasone; and about 8% received other, unspecified combinations.
At a median follow-up time of 37.8 months, the PFS rate, which was the primary endpoint, was 56.5% for the second transplant arm, 56.7% for the RVD arm, and 52.2% for the maintenance-only arm. The differences were not statistically significant.
Similarly, there were no among-arm differences in PFS for patients with standard-risk disease (60.9%, 59.5%, and 55.9%, respectively) or for those with high-risk myeloma (42.2%, 48.3%, and 40.2%).
Overall survival, a secondary endpoint, was also not significantly different among the groups, at 82%, 85.7%, and 83.4%, respectively.
Encouragingly, however, despite lower PFS rates, patients with high-risk disease had high OS rates, with 79.6% of patients in the double-transplant arm, 77.5% of those in the RVD consolidation arm, and 79.5% of those in the lenalidomide maintenance-alone arm still alive at 38 months.
Secondary malignancies occurred among 5.1% of patients overall: 14 in the dual-transplant arm, 15 in the consolidation arm, and 10 in the maintenance-only arm. The most frequently reported second malignancies were leukemia, which occurred in 3 of 14 patients with second cancers after second transplant and in 9 of 15 patients with second cancers after consolidation, and solid tumors, which occurred most frequently among second cancers in the maintenance arm.
The investigators are continuing to parse the data by study arm to see whether response assessment correlates with outcomes and with complete remissions. They also plan to examine minimal residual disease via flow cytometry and sequencing, and to obtain long-term data on survival, toxicities, and second primary malignancies.
The trial was funded by the National Institutes of Health with support from Celgene and Millennium/Takeda. Dr. Stadtmauer disclosed consulting for Takeda and travel expenses from Celgene.
AT ASH 2016
Key clinical point: Three posttransplant strategies for patients with previously untreated myeloma were comparably effective.
Major finding: There were no differences in PFS or OS among patients treated with upfront therapy and transplant followed by second transplant, consolidation, or lenalidomide maintenance alone.
Data source: Randomized prospective trial of 758 patients with multiple myeloma treated with a thalidomide analogue, proteasome inhibitor, and autologous stem cell transplant.
Disclosures: The trial was funded by the National Institutes of Health with support from Celgene and Millennium/Takeda. Dr. Stadtmauer disclosed consulting for Takeda and travel expenses from Celgene.
The war on pain
When your peer group is dominated by folks in their early 70s, conversations at dinner parties and lobster bakes invariably morph into storytelling competitions between the survivors of recent hospitalizations and medical procedures. I try to redirect this tedious and repetitive chatter with a topic from my standard collection of conversation re-starters that includes “How about those Red Sox?” and “How’s your granddaughter’s soccer season going?” But sadly I am not always successful.
Often embedded in these tales of medical misadventure are stories of unfortunate experiences with pain medications. Sometimes the story includes a description of how prescribed pain medication created symptoms that were far worse than the pain it was intended to treat. Vomiting, constipation, and “feeling goofy” are high on the list of complaints.
The caches of unused opioids left behind after these bad experiences, many of which were never needed in the first place, are evidence of why our health care has become so expensive, and they also represent the seeds from which the addiction epidemic has grown. Ironically, they are also collateral damage from an unsuccessful and sometimes misguided war on pain.
It isn’t clear exactly when or where the war on pain began, but I’m sure those who fired the first shots were understandably concerned that many patients with incurable and terminal conditions were suffering needlessly because their pain was being under-treated. Around the same time came the realization that the sooner we could get postoperative patients on their feet and taking deep breaths, the fewer complications we would see. And the more adequately we treated their pain, the sooner we could get those patients moving and breathing optimally.
In a good faith effort to be more “scientific” about pain management, patients were asked to rate their pain and smiley face charts appeared. Unfortunately, somewhere along the line came the mantra that not only should no patient’s pain go unmeasured, but no patient’s pain should go unmedicated.
The federal government entered the war when the Centers for Medicare & Medicaid Services issued a directive that hospitals ask patients at discharge whether their pain had been well controlled and how often the hospital staff had done what they could to ease it. The answers to these questions, along with others, were collected and used to assess a hospital’s quality of care and determine its level of reimbursement.
So far, there are insufficient data to determine how frequently this directive on pain management induced hospitals to over-prescribe medication, but it certainly hasn’t been associated with a decline in opioid abuse. It is reasonable to suspect that this salvo by the government has resulted in some collateral damage by encouraging a steady flow of unused and unnecessary prescription narcotics out of the hospital and onto the streets.
The good news is that enough concern has been voiced about the unintended effects of these pain management questions that the CMS has decided to withdraw them from the patient discharge questionnaire, eliminating any financial pressure clinicians might feel to over-prescribe medications.
The bad news is that we continue to fight the war on pain with a limited arsenal. As long as clinicians simply believe that no pain should go unmedicated, they will continue to miss opportunities to use other modalities such as counseling, physical therapy, and education that can be effective without the risk of collateral damage. Instead of asking the patient (who may not know the answer), we should be asking ourselves if we have been doing everything we could to help the patient deal with his pain. The answer is often not written on prescription pads.
Dr. Wilkoff practiced primary care pediatrics in Brunswick, Maine, for nearly 40 years. He has authored several books on behavioral pediatrics including “How to Say No to Your Toddler.”
VIDEO: Ankylosing spondylitis problems outside the joints strike more women than men
WASHINGTON – Women are almost twice as likely as men to develop extra-articular manifestations of ankylosing spondylitis such as uveitis and inflammatory bowel disease, according to an analysis of patients in the Ankylosing Spondylitis Registry of Ireland.
Each of those manifestations imposes its own burden on patients, over and above the inflammatory back pain of the underlying disease, Gillian Fitzgerald, MD, said at the annual meeting of the American College of Rheumatology. Many patients develop several of these separate manifestations – a circumstance that seriously affects their quality of life.
The findings of the large registry study were a bit surprising, she said during presentation of the study at a press briefing, as ankylosing spondylitis is generally thought to affect largely men. “However, this isn’t the case,” said Dr. Fitzgerald of St. James’s Hospital, Dublin. “Recent studies show that women can be affected as often as men are.”
In light of those findings, Dr. Fitzgerald and her coauthors wanted to further define the gender differences, especially with regard to extra-articular manifestations.
They accessed data on 564 patients in the registry, which was established in 2013. The majority of patients (78%) were men, the mean age was 47 years, and the mean disease duration was nearly 21 years. For almost half that time (9 years), patients had remained undiagnosed, Dr. Fitzgerald added. Overall, 78% fulfilled the modified New York criteria for ankylosing spondylitis.
Overall, extra-articular manifestations were common, with 35% having uveitis, 18% psoriasis, and 10% inflammatory bowel disease.
Uveitis was significantly more common among women (47% vs. 32%) and among those with disease duration of more than 10 years (40% vs. 22% with less than 10 years).
Inflammatory bowel disease was also significantly more common among women (16.5% vs. 8%). It wasn’t related to disease duration, but it was related to elevated baseline C-reactive protein, peptic ulcer disease, and osteoporosis.
In a multivariate regression analysis, women were 70% more likely to experience an extra-articular manifestation of the disease than were men (hazard ratio, 1.7). Having the disease for more than 10 years more than doubled the risk of an extra-articular manifestation (HR, 2.4).
Dr. Fitzgerald discussed the study’s findings in a video interview at the meeting. She had no financial disclosures.
AT THE ACR ANNUAL MEETING
Key clinical point: Women with ankylosing spondylitis were more likely than men to develop extra-articular manifestations such as uveitis and inflammatory bowel disease.
Major finding: Women were 70% more likely than men to develop an extra-articular manifestation of the disease.
Data source: The registry study comprised 564 patients.
Disclosures: Dr. Fitzgerald had no financial disclosures.
Ketogenic diet, with variations, can help adult epilepsy
HOUSTON – The ketogenic diet is usually thought of as a solution of near-last resort for pediatric epilepsy, but some adolescents and adults with epilepsy can also benefit from a very low carbohydrate diet.
There are also limited data to suggest that more palatable adaptations of the diet may provide benefit while improving adherence, said Mackenzie C. Cervenka, MD, speaking at the annual meeting of the American Epilepsy Society.
“Ketogenic diets are a reasonable option for older adolescents and adults with drug-resistant epilepsy that’s not amenable to surgical intervention,” said Dr. Cervenka, director of the Adult Epilepsy Diet Center at Johns Hopkins University, Baltimore.
The antiepileptic benefit of a diet that induces ketogenesis, forcing the brain to utilize ketone bodies rather than glucose for energy, has been known since the 1920s, with benefit seen for adolescents and adults in studies completed in the 1930s. These diets mimic a starvation state, but provide enough calories through fat or protein to maintain weight. Calories in the traditional ketogenic diet, Dr. Cervenka said, are about 90% fat. Food for patients on this diet should be weighed on a gram scale, and those preparing meals should aim for a ratio of 3 to 4 grams of fat for each gram of carbohydrate and protein combined. A modified version uses a 1:1 or 2:1 ratio, a more appealing configuration for some patients.
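For readers who want to see how the classic ratio translates into daily gram targets, the arithmetic can be sketched in a few lines of Python. This is an illustrative calculation only, not part of Dr. Cervenka's presentation, and it assumes the commonly used energy densities of 9 kcal/g for fat and 4 kcal/g for carbohydrate and protein:

```python
# Illustrative sketch only -- not clinical guidance.
# Assumes 9 kcal/g for fat and 4 kcal/g for carbohydrate and protein.

def ketogenic_gram_targets(daily_kcal: float, ratio: float = 4.0) -> dict:
    """Split a calorie target into grams of fat vs. carbohydrate + protein
    for a classic ketogenic diet with `ratio` grams of fat per 1 gram of
    carbohydrate and protein combined (e.g., 3:1 or 4:1)."""
    # One "unit" of the diet = `ratio` g fat + 1 g carbohydrate/protein.
    kcal_per_unit = ratio * 9 + 1 * 4
    units = daily_kcal / kcal_per_unit
    fat_g = units * ratio
    carb_protein_g = units
    pct_fat = fat_g * 9 / daily_kcal * 100
    return {
        "fat_g": round(fat_g, 1),
        "carb_plus_protein_g": round(carb_protein_g, 1),
        "percent_calories_from_fat": round(pct_fat, 1),
    }

# A 2,000-kcal/day plan at a 4:1 ratio works out to about 200 g of fat and
# 50 g of carbohydrate + protein -- roughly 90% of calories from fat,
# consistent with the figure cited above.
print(ketogenic_gram_targets(2000, ratio=4.0))
```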
Weighing each bite of food is cumbersome, and palatability can be a major problem as well, contributing to adherence problems with such a high-fat diet. For patients who are so ill that they are tube-fed, however, commercial ketogenic formulas are available. Necessary supplementation on a traditional ketogenic diet includes calcium, vitamin D, multivitamins, and oral citrates to prevent kidney stone formation, she said.
Ketogenic diets also appear to be anti-inflammatory: Animal models have shown less inflammation, pain, and fever when rats are fed a ketogenic diet, and proinflammatory cytokines and chemokines are reduced on a ketogenic diet in rodent models of Parkinson’s disease and multiple sclerosis. The ketogenic diet has been shown to be of particular benefit for patients with autoimmune encephalopathies.
One option with promising, but limited, results is a low-carbohydrate diet rich in medium-chain triglycerides. Medium chain triglyceride (MCT) oil is available in a commercial preparation derived from coconut or palm kernel oil. On this diet, 30%-60% of calories should come from MCTs, which is usually sufficient to induce ketosis. However, gastrointestinal side effects such as bloating and diarrhea can be pronounced, especially if the diet is begun abruptly. It’s best to ramp up slowly with MCTs, so this diet would not be appropriate for the patient who needs quick improvement in seizure control, Dr. Cervenka said.
A modified Atkins diet provides 15-20 grams of net carbohydrates daily, after dietary fiber is subtracted. Using this strategy, adolescents and adults don’t have to weigh foods. Rather, food tables are used to track carbohydrates and fiber, and ketosis is assessed by measuring urine ketones on a test strip. The goal, Dr. Cervenka said, is to achieve moderate to large ketosis (40-160 mg/dL urine ketones).
Finally, low glycemic index treatment (LGIT) is an option worth considering. This diet takes advantage of certain carbohydrate-rich foods that do not raise blood sugar quickly, such as fiber-rich vegetables or legumes with some fat content. Patients on the LGIT diet can have from 40 to 60 grams of carbohydrate daily, and the diet has been used with some success in drug-resistant childhood epilepsy as well as Angelman syndrome, she said.
Though sample sizes are small and efficacy may be modest, Dr. Cervenka said, “the effect is quick.” Finding less-restrictive modifications of these diets may help patients stay on the diet over the long term, increasing real-world effectiveness.
But long-term adherence to a ketogenic or other high-fat, low-residue diet comes with a host of unknowns about cardiovascular, metabolic, and renal health; ongoing study of these patients may yield answers about whether theoretical concerns are borne out, and whether the risk is worth it in terms of seizure benefit.
Dr. Cervenka reported receiving grant support from Nutricia North America, Vitaflo, and the BrightFocus Foundation. She has also received an honorarium for speaking from LivaNova.
EXPERT ANALYSIS FROM AES 2016
Dementia prevalence increased in heart failure patients
NEW ORLEANS – Elderly patients with heart failure had a significantly increased prevalence of both dementia and mild cognitive impairment, compared with similar people without heart failure, in an analysis of data collected from more than 6,000 U.S. residents enrolled in a long-term observational study.
Patients diagnosed with either heart failure with reduced ejection fraction or heart failure with preserved ejection fraction had an 89% increased prevalence of dementia and a 41% increased prevalence of mild cognitive impairment (MCI), compared with people from the same cohort who did not develop heart failure, in an analysis that adjusted for several demographic and clinical variables, Lucy S. Witt, MD, reported at the American Heart Association scientific sessions. She speculated that the link between heart failure and both dementia and MCI might result from impaired cerebral perfusion in heart failure patients or from effects of heart failure medications.
The analysis used data collected for the Atherosclerosis Risk in Communities (ARIC) study, which began in 1987 and enrolled a randomly selected representative cohort of nearly 16,000 women and men aged 45-64 years old who resided in any of four U.S. communities. She specifically focused on the data collected from 6,431 of the participants who returned for a fifth follow-up examination during 2011-2013, including 5,490 people without heart failure, whose average age was 76 years, and 941 participants with heart failure, whose average age was 78 years.
The adjusted prevalence of dementia at the fifth follow-up visit was 5.6% among those without heart failure and 7.0% among those with heart failure. The examinations also found MCI in an adjusted 21.5% of those without heart failure and in 26.2% of those with heart failure, Dr. Witt reported. Adjustments included age, sex, location, education, hypertension, diabetes, depression, alcohol and tobacco use, cerebrovascular disease, marital status, and several other factors.
The relative risk for having dementia among the heart failure patients was roughly similar, regardless of whether ARIC participants with heart failure had a reduced or preserved left ventricular ejection fraction, she said.
ARIC is funded by the National Heart, Lung, and Blood Institute. Dr. Witt had no disclosures.
[email protected]
On Twitter @mitchelzoler
AT THE AHA SCIENTIFIC SESSIONS
Key clinical point: Elderly patients with heart failure had a higher prevalence of both dementia and mild cognitive impairment than similar people without heart failure.
Major finding: Patients with heart failure had an adjusted 86% increased rate of dementia, compared with similar people without heart failure.
Data source: Analysis of 6,431 ARIC participants who returned for a fifth follow-up examination during 2011-2013.
Disclosures: The Atherosclerosis Risk in Communities (ARIC) study is funded by the National Heart, Lung, and Blood Institute. Dr. Witt had no disclosures.
Blood pressure rise follows halting CPAP
Continuous positive airway pressure (CPAP) therapy for obstructive sleep apnea (OSA) has a significant beneficial effect on blood pressure, according to an analysis of participants in three randomized controlled trials.
Previous meta-analyses suggested that CPAP treatment led to an average improvement of 2-3 mm Hg, but those estimates relied on heterogeneous trials that often had low levels of CPAP adherence, factors that might have led to an underestimation of the treatment effect. The new analysis showed that halting CPAP increases blood pressure by 5.0-9.0 mm Hg, compared with continuing CPAP (Chest. 2016;150[6]:1202-10).
To get around the problem of adherence, researchers led by Malcolm Kohler, MD, at University Hospital of Zürich analyzed the results of three previous studies looking at the effects of CPAP withdrawal. The analysis included 153 OSA patients on CPAP therapy, who had been randomized to continue therapy or to withdraw from therapy for 2 weeks. Eighty-seven of these patients discontinued CPAP, and the remaining 66 patients continued the therapy. Blood pressure was measured at home and in hospital.
On average, those who discontinued CPAP had an increase in office systolic blood pressure of 5.4 mm Hg (95% confidence interval, 1.8-8.9 mm Hg; P = .003) and an increase in home systolic blood pressure of 9.0 mm Hg (95% CI, 5.7-12.3 mm Hg; P less than .001), compared with patients who continued CPAP. The effects of stopping CPAP, instead of continuing the therapy, on office diastolic blood pressure and home diastolic pressure were increases of 5.0 mm Hg (95% CI, 2.7-7.3 mm Hg; P less than .001) and 7.8 mm Hg (95% CI, 5.6-10.0 mm Hg; P less than .001), respectively.
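For readers less familiar with this style of reporting, the snippet below shows, with simulated blood-pressure changes rather than the trial’s data, how a between-group difference in means, its 95% confidence interval, and a P value are computed.

```python
# Simulated example of a between-group mean difference with a 95% CI and P value.
# Numbers are invented, and this simple two-sample comparison is for illustration only;
# it does not reproduce the published analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
withdrawal = rng.normal(9.0, 12.0, 87)    # change in home systolic BP, mm Hg
continuation = rng.normal(0.0, 12.0, 66)  # change in home systolic BP, mm Hg

diff = withdrawal.mean() - continuation.mean()
se = np.sqrt(withdrawal.var(ddof=1) / len(withdrawal)
             + continuation.var(ddof=1) / len(continuation))
dof = len(withdrawal) + len(continuation) - 2  # simple pooled-df approximation
t_crit = stats.t.ppf(0.975, dof)
ci_low, ci_high = diff - t_crit * se, diff + t_crit * se
p_value = 2 * stats.t.sf(abs(diff / se), dof)

print(f"Difference: {diff:.1f} mm Hg (95% CI, {ci_low:.1f} to {ci_high:.1f}); P = {p_value:.3g}")
```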
Patients who discontinued CPAP also experienced a significant increase in apnea-hypopnea index, from 2.8/h to 33.2/h, while those who continued using CPAP, on average, experienced only a 0.3/h increase in apnea-hypopnea index from baseline.
“One clinical implication is that if you do not need to stop CPAP for obstructive sleep apnea, do not stop it. This study also suggests the importance of monitoring your blood pressure in a home setting, under usual conditions,” summed up Robert Kloner, MD, PhD, director of the Huntington Medical Research Institutes Cardiovascular Research Lab, Pasadena, Calif., who was not involved in the study.
Previous studies of CPAP, such as the SAVE study published in the New England Journal of Medicine in September (N Engl J Med. 2016;375:919-31), often find little or no connection between CPAP therapy and cardiovascular outcomes. That is probably because of inadequate adherence to CPAP therapy. “That’s always been the bane of sleep apnea studies,” said Krishna M. Sundar, MD, FCCP, who also did not participate in the study.
The current work got around the problem by looking at patients who had already established use of CPAP. “This is a very good study,” said Dr. Sundar, who is the medical director of the Sleep-Wake Center at the University of Utah, Salt Lake City.
The study was funded by the Swiss National Science Foundation and the University of Zürich. The analysis’ authors and the outside experts quoted in this story reported no financial disclosures.
FROM CHEST
Key clinical point: Interrupting CPAP therapy leads to a rise in blood pressure.
Major finding: Stopping CPAP was associated with a 5.0-9.0 mm Hg increase in blood pressure.
Data source: Analysis of 153 patients with moderate to severe OSA, who had participated in three randomized controlled trials.
Disclosures: The study was funded by the Swiss National Science Foundation and the University of Zürich. The authors of the analysis and the outside experts quoted in this story, Dr. Kloner and Dr. Sundar, reported no financial disclosures.
Genetics dictate interferon-alfa diarrhea risk
Genetics are probably to blame for why some patients have significant intestinal side effects from Roferon-A (interferon alfa-2a, recombinant – Roche) while others do not, according to a new French investigation reported in Cellular and Molecular Gastroenterology and Hepatology.
The investigators cultured intestinal wall samples taken from 20 colon cancer patients at the time of surgery; none of the patients had been exposed to chemotherapy, radiation, or immunosuppressives. The team bathed the cultured tissue in IFN-alfa 2a and other biochemicals to see how it reacted. Although this was bench work, the findings could eventually be useful if the team identifies the genetic risk factors for Roferon intestinal side effects and finds better options for patients at risk. Roferon is widely used for blood cancers, melanoma, viral hepatitis, and renal and hepatocellular carcinomas.
“IFN-alfa 2a elicited a rapid (24 hours) disruption of surface and crypt colonic epithelial cells via apoptosis that was variable in intensity among the 20 individuals studied. This apoptotic effect was dependent on the initiation of an IFN-gamma response … expressed in T cell–positive lamina propria cells,” the investigators said.
“IFN-alfa impairs human intestinal mucosa homeostasis by eliciting epithelial barrier disruption via apoptosis … The IFN-alfa–elicited impairment of intestinal mucosa homeostasis is heterogeneous among individuals,” Dr. Jarry and her associates wrote.
“This ex vivo finding parallels clinical observations of the interpatient variability of Roferon therapy side effects. … It has been reported that approximately 60% of patients with chronic hepatitis or cancer treated with Roferon have intestinal disorders, especially diarrhea,” they said.
The authors had no conflicts of interest. The work was funded by the University of Nantes.
Many studies have implicated cytokines in various human GI-tract disorders, but there is still limited information in the literature about their exact role in the maintenance and disturbance of tissue homeostasis and the molecular mechanisms involved.
Interestingly, IFN-alfa did not induce apoptosis in all human colonic fragments analyzed, showing that the culture model accounts for variability among individuals and recapitulates the heterogeneous response of cancer patients who develop intestinal dysfunction as a side effect of IFN-alfa-based treatment. The use of such human primary cell culture models helps us develop a better understanding of how cytokines affect the GI mucosa, potentially leading to alternative targets for treatment. They also could be used to determine the individual patient differences underlying diverse responses to cytokines and drugs, which is particularly important to the advance of precision/personalized medicine.
Jason C. Mills, MD, PhD, and Luciana H. Osaki, PhD, are with the division of gastroenterology, departments of medicine, pathology and immunology, and developmental biology, Washington University, St. Louis. They have no conflicts of interest.
FROM CELLULAR AND MOLECULAR GASTROENTEROLOGY AND HEPATOLOGY
VIDEO: Point-of-care microsensor prototype beats conventional coagulopathy tests
SAN DIEGO – Using less than a drop of blood, a portable microsensor provided a comprehensive coagulation profile in minutes and perfectly distinguished various coagulopathies from normal blood samples – handily beating both activated partial thromboplastin time (aPTT) and prothrombin time (PT).
Dubbed ClotChip, the disposable device detects coagulation factors and platelet activity by using a technique called dielectric spectroscopy, Evi X. Stavrou, MD, of Case Western Reserve University, Cleveland, said in a video interview at the annual meeting of the American Society of Hematology. It points the way for comprehensive, rapid, point-of-care assessment of critically ill or severely injured patients and those who need ongoing monitoring to evaluate response to anticoagulant therapy, she added.
By plotting the rate of true positives (correctly identified coagulopathy cases) against the rate of false positives (controls incorrectly flagged as abnormal), the researchers obtained areas under the receiver operating characteristic curve of 100% for ClotChip, 78% for aPTT, and 57% for PT. In other words, ClotChip correctly distinguished all cases from controls in this small patient cohort, which neither aPTT nor PT did.
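For context, the area under the receiver operating characteristic (ROC) curve summarizes how well a test separates cases from controls across all possible cutoffs; an area of 100% means perfect separation. The sketch below computes an AUC from invented sensor readings (not the ClotChip data) using scikit-learn.

```python
# Illustrative ROC/AUC computation on synthetic data; these are not the ClotChip readings.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(42)
# 1 = coagulopathy case, 0 = control (labels and readings are invented)
labels = np.concatenate([np.ones(30), np.zeros(30)]).astype(int)
readings = np.concatenate([
    rng.normal(2.0, 0.5, 30),  # cases: higher hypothetical sensor readout
    rng.normal(1.0, 0.5, 30),  # controls: lower hypothetical sensor readout
])

auc = roc_auc_score(labels, readings)
fpr, tpr, thresholds = roc_curve(labels, readings)
print(f"AUC = {auc:.2f}")  # an AUC of 1.00 would mean perfect case/control separation
```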
Dr. Stavrou and her coinvestigators had no relevant financial disclosures.
AT ASH 2016
Brentuximab vedotin beat methotrexate, bexarotene in cutaneous T-cell lymphoma
SAN DIEGO – For patients with CD30-expressing cutaneous T-cell lymphoma, antibody-drug conjugate therapy with brentuximab vedotin significantly outperformed two standard regimens in the phase III ALCANZA trial.
After a median of 17.5 months of follow-up, 56% of patients receiving brentuximab vedotin had an objective response lasting at least 4 months, versus 13% of patients treated with physician’s choice of methotrexate or bexarotene (P less than .0001), Youn H. Kim, MD, said during an oral presentation at the annual meeting of the American Society of Hematology.
As in past studies, brentuximab vedotin caused high rates of peripheral neuropathy, but more than 80% of cases improved or resolved over time, she said.
This is the first reported phase III trial to convincingly show that a new systemic agent outperformed standard therapies for cutaneous T-cell lymphoma (CTCL), which tend to have inadequate and short-lived efficacy, stated Dr. Kim, of Stanford (Calif.) University. Brentuximab vedotin not only met the primary endpoint, but all other predefined endpoints, including progression-free survival and a quality-of-life measure, she said.
“These compelling results have potential practice-changing implications,” she concluded.
Brentuximab vedotin (Adcetris) targets CD30, which is expressed in skin lesions of about half of patients with CTCL. A protease-cleavable linker attaches an anti-CD30 monoclonal antibody to monomethyl auristatin E, which disrupts microtubules when released into CD30-positive tumor cells (Blood. 2013;122:367). The agent showed clinical activity against CTCL in two previous phase II trials.
Accordingly, the international, open-label phase III ALCANZA study enrolled 128 treatment-experienced patients with CD30-expressing mycosis fungoides or primary cutaneous anaplastic large cell lymphoma. Patients were randomly assigned to receive brentuximab vedotin (1.8 mg/kg once every 3 weeks) or physician’s choice of either methotrexate (5-50 mg once weekly) or bexarotene (300 mg/m2 once daily) for up to sixteen 3-week cycles, or until disease progression or unacceptable toxicity. Methotrexate and bexarotene were designated “physician’s choice” because they are used worldwide for treating CTCL, according to Dr. Kim.
To capture both the rate and duration of response, researchers defined objective response lasting at least 4 months as the primary endpoint. Brentuximab vedotin more than quadrupled the likelihood of this outcome when compared with the standard CTCL regimens, a trend that spanned key demographic and clinical subgroups, Dr. Kim said.
“All endpoints were highly [statistically] significant,” she further reported. For example, the objective response rate with brentuximab vedotin was 67%, versus 20% for methotrexate or bexarotene. Respective rates of complete response were 16% and 2%, and median durations of progression-free survival were 17 and 4 months, translating to a 73% lower risk of progression or death with brentuximab vedotin (95% confidence interval, 57%-83%). Patients who received brentuximab vedotin also reported about a three-fold greater improvement on the Skindex-29 symptom domain, compared with the physician’s choice group (–29 vs. –9 points; P less than .0001).
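To unpack that phrasing, a “73% lower risk” corresponds to a hazard ratio of about 0.27, and the 57%-83% interval corresponds to hazard ratios of roughly 0.43 to 0.17. The snippet below shows this back-of-the-envelope conversion; the hazard ratios are derived here from the quoted percentages, not taken directly from the trial report.

```python
# Convert between a hazard ratio and the "X% lower risk" phrasing used above.
# The hazard ratio here is back-calculated from the quoted 73% (57%-83%) figures,
# not taken directly from the ALCANZA report.

def risk_reduction_pct(hazard_ratio: float) -> float:
    """Percent lower risk implied by a hazard ratio below 1."""
    return (1.0 - hazard_ratio) * 100.0

def hazard_ratio_from_reduction(pct_lower: float) -> float:
    """Hazard ratio implied by an 'X% lower risk' statement."""
    return 1.0 - pct_lower / 100.0

hr_point = hazard_ratio_from_reduction(73)   # 0.27
hr_lower = hazard_ratio_from_reduction(83)   # 0.17 (greater reduction -> smaller HR)
hr_upper = hazard_ratio_from_reduction(57)   # 0.43
print(f"HR ~ {hr_point:.2f} (95% CI, {hr_lower:.2f}-{hr_upper:.2f}), "
      f"i.e., {risk_reduction_pct(hr_point):.0f}% lower risk of progression or death")
```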
The safety profile of brentuximab vedotin resembled that seen in previous studies, Dr. Kim said. Most notably, 67% of patients developed peripheral neuropathy, and 9% developed grade 3 peripheral neuropathy. This usually improved or resolved over about the next 22 months. Diarrhea, fatigue, and vomiting affected about a third of patients on brentuximab vedotin, and about one in four stopped treatment because of adverse events, compared with 8% of the physician’s choice arm. Rates of serious adverse events were 41% and 47%, respectively. One brentuximab vedotin recipient died of multiple organ dysfunction syndrome that investigators attributed to treatment-associated necrosis of peripheral tumors. They identified no other treatment-related deaths.
Seattle Genetics and Takeda funded the trial. Dr. Kim disclosed ties to Takeda and Seattle Genetics, as well as several other pharmaceutical companies.
AT ASH 2016
Key clinical point: Brentuximab vedotin met all its endpoints but often caused peripheral neuropathy in a phase III trial of patients with CD30-expressing cutaneous T-cell lymphoma.
Major finding: After a median of 17.5 months of follow-up, 56% of patients receiving brentuximab vedotin had an objective response lasting at least 4 months, versus 13% of those receiving physician’s choice of methotrexate or bexarotene (P less than .0001).
Data source: A multicenter, open-label phase III trial of 128 patients with CD30-expressing mycosis fungoides or primary cutaneous anaplastic large cell lymphoma.
Disclosures: Seattle Genetics and Takeda funded the trial. Dr. Kim disclosed ties to Seattle Genetics and Takeda, as well as several other pharmaceutical companies.