Creating a digital pill

Technology battles medication noncompliance

Hospitalists and other physicians have long struggled with medication noncompliance, which can lead to sicker patients and higher rates of readmission, and which costs an estimated $100 billion to $289 billion a year.

A growing field of digital devices is being developed to address this problem. The Food and Drug Administration has just approved the newest one: a medication with an embedded sensor that can tell doctors whether, and when, patients take their medicine, according to an article in the New York Times.1 It’s expected to become available in 2018.

The digital medication is a version of the antipsychotic Abilify. Patients who agree to take it will sign consent forms allowing their doctors (and up to four other people) to receive electronic data showing the date and time pills are ingested.

The sensor, created by Proteus Digital Health, contains copper, magnesium, and silicon, all said to be safe ingredients found in foods. The electrical signal is created when stomach fluids contact the sensor; a patch worn on the rib cage detects that signal and sends the message.

Other companies are joining the race to create digital medication technologies; these are being tested in medications for patients with conditions including heart disease, diabetes, and HIV infection. Some researchers predict the technology might have applications for monitoring the opioid intake of postsurgical patients or patients in medication clinical trials.

Reference

1. Belluck P. “First Digital Pill Approved to Worries About Biomedical ‘Big Brother.’” New York Times. Nov 13, 2017.

Serious complications linked to rituximab in MS

In a sign of the potential complications that can be spawned by B-cell–depleting therapies, a new report found that 5 of 30 patients with relapsing-remitting multiple sclerosis (RRMS) had to discontinue or interrupt long-term treatment with rituximab (Rituxan) because of serious infections such as pneumonia, septic arthritis, and sinusitis.

The findings are a “big lesson to not just focus on opportunistic infections [with Rituxan use] but also consider nonopportunistic infections that could occur,” lead study author Cindy Darius, a registered nurse with the Johns Hopkins Multiple Sclerosis Center (JHMSC), Baltimore, said in an interview. She presented the research at the annual meeting of the Consortium of Multiple Sclerosis Centers.

The Food and Drug Administration has approved rituximab for lymphoma, leukemia, rheumatoid arthritis, and some rare conditions, but its use in MS is off label. Earlier this year, however, a Swedish study in JAMA Neurology found that the drug “performs better than other commonly used DMTs [disease-modifying therapies] in patients with newly diagnosed RRMS” (JAMA Neurol. 2018;75[3]:320-7).

As Ms. Darius noted, progressive multifocal leukoencephalopathy (PML) has been the main focus of discussions about the use of rituximab in MS because PML has been noted in patients who have taken rituximab for other conditions.

But Ms. Darius said that the JHMSC observed a trend of patients with MS who took rituximab and developed “these weird infections that were more nonopportunistic infections. That prompted us to dig a little bit deeper: Are these infections happening sporadically, or could they have a connection with Rituxan?”

Ms. Darius and her colleagues retrospectively reviewed the records of 30 patients with MS who were prescribed rituximab by a single JHMSC physician since 2012. They found five cases of infectious complications, all in patients with RRMS:

  • A woman, aged 30 years, whose rituximab regimen was interrupted after 4 years of treatment when she developed recurrent pneumonia.
  • A man, aged 42 years, who took rituximab for a year then stopped after developing ringworm and two bouts of Staphylococcus aureus septic arthritis, and who had previously changed from natalizumab (Tysabri) to rituximab after seroconverting to the John Cunningham virus.
  • A woman, aged 65 years, with Sjögren’s syndrome who stopped rituximab at 2 years after developing sinusitis, pneumonia, and herpes simplex virus keratitis.
  • A woman, aged 38 years, who discontinued rituximab after 2 years because of recurrent urosepsis, sinusitis, and pyrexia of unknown origin.
  • A woman, aged 56 years, who stopped rituximab after 2 years following intractable sinusitis and pneumonia that resulted in empyema and required a thoracotomy.

What might be causing the apparent side effects? Ms. Darius pointed out that the patients were already immunocompromised because of previous treatment with first- and/or second-line medications. She added that the complications “may be due to dosing that may be a little too high for the MS population.”

JHMSC is considering whether to give doses of the drug once a year instead of twice annually, she said. “Other providers are cutting the dose in half: Instead of 1,000 mg, they’re giving 500,” she added. “After the patient has been on the medication for a year or two, and you feel the disease process has stabilized, you may want to consider adjusting the dosage.”

Going forward, the researchers wrote that they “plan to determine the incidence of all serious infectious complications related to rituximab use among MS patients attending the JHMSC, and the influence of different dosing protocols between MS providers in this regard.”

No study funding was reported, and most study authors reported no relevant disclosures. One author reported receiving National Institutes of Health funding and another reported consulting for Biogen and Genentech.

SOURCE: Darius C et al. CMSC 2018, Abstract DX57.

Vitals

 

Key clinical point: Much of the attention to side effects of rituximab as an off-label treatment for multiple sclerosis has focused on progressive multifocal leukoencephalopathy, but other infections may affect this population over the long term.

Major finding: Of 30 patients treated with rituximab for MS, 5 developed infections that required suspension or cessation of the treatment.

Study details: A retrospective analysis of 30 patients with MS treated with rituximab since 2012.

Disclosures: No study funding was reported, and most study authors reported no relevant disclosures. One author reported receiving National Institutes of Health funding and another reported consulting for Biogen and Genentech.

Source: Darius C et al. CMSC 2018, Abstract DX57.

Sickle cell disease exacts a heavy vocational toll

Three-quarters of patients with sickle cell disease (SCD) reported missing work in the last year because of disease symptoms, according to results from a single-center study.

While the direct costs of SCD are easy to measure, it’s harder to capture the indirect costs patients may incur from this chronic, progressive disease, which range from lost days at work to the downstream consequences of “presenteeism.”

“Indirect costs are related to things that have value, but it’s a little bit harder to apply an exact value to it,” said Nicholas Vendetti of Pfizer. But this is a critical piece for understanding SCD, he said. “The burden of illness is unknown without productivity costs.”

Mr. Vendetti and his collaborators attempted to capture the indirect costs of SCD, and reported the results of a single-site study at the annual meeting of the Foundation for Sickle Cell Disease Research.

They recruited patients from Virginia Commonwealth University’s adult sickle cell clinic and trained interviewers to conduct structured interviews using the Institute for Medical Technology Assessment Productivity Cost Questionnaire. The interviewers asked about absenteeism, lost work, unpaid work activity, and “presenteeism,” defined as days when participants were at work but experienced decreased work output because of disease symptoms.

In the end, the study enrolled 186 patients aged 18 years and older, a figure that “really exceeded what we expected when we started the protocol,” Mr. Vendetti said. Most participants were between the ages of 20 and 60 years – the most productive working years.

About 58% of participants were female. Nearly half (46%) had the HbSS genotype, while 30% had the HbSC genotype. About half (52%) were high school graduates, and about a third had some college. There were no advanced degrees earned in the study population, and 11.5% had not finished high school.

Initial questions about educational status and employment status “highlighted a very interesting aspect of the disease: 43.8% reported that they were currently unable to work as part of their disease process,” Mr. Vendetti said. Just 28% were employed for wages, 3% were self-employed, and about 7% reported being homemakers. The remainder were out of work, were students, or were retired.

Three-quarters of patients reported missing work in the last year because of SCD symptoms. This group reported missing a mean of 36.75 days yearly. Assuming the average Virginia wage of $25.53 per hour, this comes to an average of $7,506 in lost wages each year, Mr. Vendetti said.
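
The arithmetic behind that figure is straightforward; the sketch below is a rough back-of-the-envelope check in Python that assumes an 8-hour workday, an assumption not stated in the presentation.

    # Rough check of the lost-wage estimate reported above.
    # The 8-hour workday is an assumption; it was not specified in the presentation.
    missed_days_per_year = 36.75
    hours_per_workday = 8             # assumed
    hourly_wage = 25.53               # average Virginia wage cited in the talk
    lost_wages = missed_days_per_year * hours_per_workday * hourly_wage
    print(f"Estimated annual lost wages: ${lost_wages:,.2f}")  # about $7,506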

Presenteeism had a large impact as well. Nearly 73% of patients said they were bothered at work – either psychologically or physically – by their symptoms in the last 4 weeks, and 90% over the past year. These patients estimated they were affected for about 100 working days yearly.

When asked on a scale of 0-10 how much work they were able to get done on days when their SCD was affecting productivity, “most patients are falling into that middle range” of a score of 4-6, Mr. Vendetti said. “Most patients are moderately affected.

“It’s hard to apply a dollar value to that, but it’s easy to see how it could affect the trajectory of your career,” he added.

Another aspect of the indirect cost of the sickle cell disease burden that is even harder to tease out is the unpaid work that those affected are unable to complete. Again, about three-quarters of patients reported that SCD had affected their ability to do this kind of work, and these patients said this happened on an average of 105 days each year.

Even though patients may not be hiring others to do housework they’re unable to complete, or to care for children on days when they’re too unwell to do so, that doesn’t mean there’s no impact on the patient and those around them, Mr. Vendetti said. “If you ask a family member or a friend for help, that creates a strain in the relationship.”

In terms of resources to address the indirect burden of SCD on careers, Mr. Vendetti pointed out that many states have vocational rehabilitation programs that offer substantial support and assistance in finding a productive work path that still accommodates a chronic illness such as SCD. In Virginia, he said, individuals need to be on disability to avail themselves of the program.

Health care providers can educate themselves about these and other programs. “Most adult sickle cell disease patients didn’t even know they might be eligible” for vocational assistance, he said.

During discussion after the presentation, an audience member pointed out that parents and caregivers of children with SCD are probably also incurring significant indirect costs because of their caregiving burden and that this population should also be studied. Mr. Vendetti agreed. “This is potential that isn’t fulfilled” for all patients and families whose work and personal lives are so profoundly affected by SCD, he said. “This is a dream deferred.”

Mr. Vendetti is employed by Pfizer and is a Pfizer stockholder. A coauthor of the study is a Pfizer consultant.

Vitals

 

Key clinical point: Sickle cell patients reported missing work or being affected by their symptoms while working.

Major finding: Three-quarters of patients reported missing work in the last year because of SCD symptoms. This group reported missing a mean of 37 days yearly.

Study details: Single-site survey-based study of 186 adults with SCD.

Disclosures: Pfizer sponsored the study. Mr. Vendetti is employed by Pfizer and holds Pfizer stock. A study coauthor is a Pfizer consultant.

Dual-targeting CAR T active against AML in mice and one man

A novel compound chimeric antigen receptor (cCAR) T-cell construct directed against two different targets may one day serve as a standalone therapy, as a supplement to chemotherapy, or as a bridge to transplant for patients with refractory acute myeloid leukemia (AML), investigators asserted.

To date, however, only one patient – a man with treatment-refractory AML – has been treated with the cCAR T, which contains two independent complete units, one directed against CD33 to target bulky disease and the other targeted against CLL1 on leukemic stem cells.

“Our preclinical study has shown that our CLL1/CD33 compound CAR possess[es] consistent, specific, and potent antitumor activity against a variety of CLL1+ and/or CD33+ leukemia cells in vitro and in vivo,” Fang Liu, MD, PhD, of the Chengdu (China) Military General Hospital said at the annual congress of the European Hematology Association.

The patient was a 44-year-old man with AML who remained refractory after four cycles of chemotherapy and had 20% bone marrow blasts. He achieved a complete response after infusion with the cCAR T cells and went on to bone marrow transplant with no evidence of minimal residual disease (MRD) at 3 months of follow-up, Dr. Liu said.

Although anti-CD19 CAR T cells have been demonstrated to have significant efficacy in relapsed or refractory B-cell acute lymphoblastic leukemia, AML is a tougher problem to solve because the heterogeneity of myeloid leukemia cells allows some cells to escape targeting by enhanced T cells, which leads to eventual relapse.

To get around this problem, the investigators created a CAR T with a one-two punch, with one component targeting the antigen CLL1, which is expressed on leukemic stem cells, and a second, separate component targeting CD33, a myeloid marker expressed on bulk AML disease cells in a majority of patients.

They first tested the cCAR T cells against several AML cell lines and primary human AML samples, then in mouse models of human AML.

In vitro assays showed that the construct had specific antitumor activity against cell lines engineered to express either of the target antigens and also against samples from AML patients. In mouse models created with engineered CLL1 or CD33 expressing cell lines and an AML cell line, the cCAR T cells caused significant reductions in tumor burden and led to prolonged survival, Dr. Liu said.

Since CAR T-cell therapy is associated with serious or life-threatening side effects, such as cytokine release syndrome, the investigators built an “off switch” into the cCAR T construct that could be activated by alemtuzumab (Campath), a monoclonal antibody directed against CD52. Introducing this agent into the mice quickly neutralized the cCAR T therapy, Dr. Liu said.

Finally, the investigators tested the construct in the human patient. He received the cCAR T construct after conditioning with fludarabine and cyclophosphamide; he had a complete remission by day 19 after receiving the cells and was MRD negative. He went on to an allogeneic stem cell transplant on day 44, and he remained MRD negative 3 months after transplant.

Side effects associated with the treatment included a grade 1 cytokine release syndrome event manifesting as fever and chills, as well as a lung infection and red blood cell transfusion dependence; the patient remained independent of platelet transfusions.

The investigators have initiated a phase 1 trial and plan to enroll 20 patients to further evaluate the efficacy and safety of the cCAR T construct.

The study was supported by iCell Gene Therapeutics. Dr. Liu reported having no conflicts of interest.

SOURCE: Liu F et al. EHA Congress, Abstract S149.

Vitals

 

Key clinical point: The compound CAR T therapy targets CLL1 on leukemic stem cells and CD33 on bulky disease.

Major finding: The only human patient treated with the construct had a complete remission and successful bridge to transplant.

Study details: Preclinical study plus phase 1 data on one patient.

Disclosures: The study was supported by iCell Gene Therapeutics. Dr. Liu reported having no conflicts of interest.

Source: Liu F et al. EHA Congress, Abstract S149.

Midlife retinopathy predicts ischemic stroke

The more severe retinopathy is at midlife, the greater the risk of ischemic stroke – particularly lacunar stroke – later on, according to investigators from Johns Hopkins University, Baltimore.

Retinopathy has been associated with strokes before, but the investigators wanted to see whether it could predict stroke type. The idea is that microvascular changes in the retina could mirror microvascular changes in the brain that could lead to stroke.

The positive findings mean that “retinal microvasculature may serve as a biomarker for cerebrovascular health. Retinal imaging may enable further risk stratification of cerebrovascular and neurodegenerative diseases for early, intensive preventive interventions,” said lead investigator Michelle Lin, MD, a stroke fellow at Johns Hopkins.

A full evaluation is beyond the scope of a quick ophthalmoscope checkup in the office. The advent of smartphone fundoscopic cameras and optical coherence tomography – which provides images of retinal vasculature at micrometer-level resolution – will likely help retinal imaging reach its full potential in the clinic, she said at the annual meeting of the American Academy of Neurology.

Dr. Lin and her team reviewed 10,468 participants in the Atherosclerosis Risk in Communities Study database. They had baseline retinal photographs from 1993-1995 when they were 45-65 years old. The photos were checked for four types of retinopathy: arteriovenous nicking, focal arteriolar narrowing, retinal microaneurysms, and retinal hemorrhage. The presence of each one was given a score of 1, yielding a retinopathy severity score of 0-4, with 4 meaning subjects had all four types.
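
To make the scoring concrete, the snippet below simply tallies the severity score as a count of the four findings; the dictionary keys are paraphrased from the text, and the values shown are hypothetical, not study data.

    # Illustrative tally of the 0-4 retinopathy severity score described above.
    # Finding names are paraphrased; the True/False values are hypothetical.
    findings = {
        "arteriovenous_nicking": True,
        "focal_arteriolar_narrowing": False,
        "retinal_microaneurysms": True,
        "retinal_hemorrhage": False,
    }
    severity_score = sum(findings.values())  # 0 = none present, 4 = all four types
    print(severity_score)  # 2 for this hypothetical example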

Over a median follow-up period of 18.8 years, 578 participants had an ischemic stroke, including 114 lacunar strokes, 292 nonlacunar strokes, and 172 cardioembolic strokes. Hemorrhagic strokes occurred in 95 subjects.

The incidence of ischemic stroke increased with the severity of baseline retinopathy, from 2.7 strokes per 1,000 participant-years among those with no retinopathy to 10.2 among those with a severity score of 3 or higher (P less than .001). The 15-year cumulative risk of ischemic stroke with any retinopathy was 3.4% versus 1.6% with no retinopathy (P less than .001).

After adjustment for age, sex, race, comorbidities, and other confounders, retinal microvasculopathy was positively associated with ischemic stroke, especially lacunar stroke (adjusted hazard ratio, 1.84; 95% confidence interval, 1.23-2.74; P = .005).

Trends linking retinopathy severity to the incidence of nonlacunar, cardioembolic, and hemorrhagic strokes were not statistically significant. Factors associated with higher retinopathy grade included older age, black race, hypertension, and diabetes, among others.

There were slightly more women than men in the review. The average age at baseline was 59 years. Patients with stroke histories at baseline were excluded.

The work was funded by the National Institutes of Health. The investigators had no disclosures.
 

SOURCE: Lin MP et al. Neurology. 2018 Apr;90(15 Suppl.):CCI.001.

Technology to the rescue

The findings are really not surprising. Retinal microvascular changes are a sign of end-organ damage. Small-vessel disease in the eye, small-vessel disease in the brain. This makes a lot of sense.

The question is: Which magic wand do we need to be able to measure and calculate these changes? These are not changes you are going to be able to detect easily when looking at the ocular fundus with your ophthalmoscope. These are very subtle changes we are talking about.

There’s new technology, like optical coherence tomography, and this is what will save us. People are working to provide us tools to automatically calculate retinal microvascular changes from fundus photographs. I have no doubt that within the next 2-3 years we will be able to use this technology. We are almost there; we are in the hands of engineers.

Valerie Biousse, MD, is a professor of neuro-ophthalmology at Emory University, Atlanta. She had no relevant disclosures.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event
Related Articles
Body

 

The findings are really not surprising. Retinal microvascular changes are a sign of end-organ damage. Small-vessel disease in the eye, small-vessel disease in the brain. This makes a lot of sense.

The question is: Which magic wand do we need to be able to measure and calculate these changes? These are not changes you are going to be able to detect easily when looking at the ocular fundus with your ophthalmoscope. These are very subtle changes we are taking about.

There’s new technology, like optical coherence tomography, and this is what will save us. People are working to provide us tools to automatically calculate retinal microvascular changes from fundus photographs. I have no doubt that within the next 2-3 years we will be able to use this technology. We are almost there; we are in the hands of engineers.

Valerie Biousse, MD , is a professor of neuro-ophthalmology at Emory University, Atlanta. She had no relevant disclosures.

Body

 

The findings are really not surprising. Retinal microvascular changes are a sign of end-organ damage. Small-vessel disease in the eye, small-vessel disease in the brain. This makes a lot of sense.

The question is: Which magic wand do we need to be able to measure and calculate these changes? These are not changes you are going to be able to detect easily when looking at the ocular fundus with your ophthalmoscope. These are very subtle changes we are taking about.

There’s new technology, like optical coherence tomography, and this is what will save us. People are working to provide us tools to automatically calculate retinal microvascular changes from fundus photographs. I have no doubt that within the next 2-3 years we will be able to use this technology. We are almost there; we are in the hands of engineers.

Valerie Biousse, MD , is a professor of neuro-ophthalmology at Emory University, Atlanta. She had no relevant disclosures.

Title
Technology to the rescue
Technology to the rescue

 

– The more severe retinopathy is at midlife, the greater the risk of ischemic stroke – particularly lacunar stroke – later on, according to investigators from Johns Hopkins University, Baltimore.

Retinopathy has been associated with strokes before, but the investigators wanted to see whether it could predict stroke type. The idea is that microvascular changes in the retina could mirror microvascular changes in the brain that could lead to stroke.

The positive findings mean that “retinal microvasculature may serve as a biomarker for cerebrovascular health. Retinal imaging may enable further risk stratification of cerebrovascular and neurodegenerative diseases for early, intensive preventive interventions,” said lead investigator Michelle Lin, MD, a stroke fellow at Johns Hopkins.

A full evaluation is beyond the scope of a quick ophthalmoscope check up in the office. The advent of smartphone fundoscopic cameras and optical coherence tomography – which provides images of retinal vasculature at micrometer-level resolution – will likely help retinal imaging reach its full potential in the clinic, she said at the annual meeting of the American Academy of Neurology.

Dr. Lin and her team reviewed 10,468 participants in the Atherosclerosis Risk in Communities Study database. They had baseline retinal photographs from 1993-1995 when they were 45-65 years old. The photos were checked for four types of retinopathy: arteriovenous nicking, focal arteriolar narrowing, retinal microaneurysms, and retinal hemorrhage. The presence of each one was given a score of 1, yielding a retinopathy severity score of 0-4, with 4 meaning subjects had all four types.

Over a median follow-up period of 18.8 years, 578 participants had an ischemic stroke, including 114 lacunar strokes, 292 nonlacunar strokes, and 172 cardioembolic strokes. Hemorrhagic strokes occurred in 95 subjects.

The incidence of ischemic stroke increased with the severity of baseline retinopathy, from 2.7 strokes per 1,000 participant-years among those with no retinopathy to 10.2 among those with a severity score of 3 or higher (P less than .001). The 15-year cumulative risk of ischemic stroke with any retinopathy was 3.4% versus 1.6% with no retinopathy (P less than .001).

After adjustment for age, sex, race, comorbidities, and other confounders, retinal microvasculopathy associated positively with ischemic stroke, especially lacunar stroke (adjusted hazard ratio, 1.84; 95% confidence interval, 1.23-2.74; P = .005).

Trends linking retinopathy severity to the incidence of nonlacunar, cardioembolic, and hemorrhagic strokes were not statistically significant. Factors associated with higher retinopathy grade included older age, black race, hypertension, and diabetes, among others.

There were slightly more women than men in the review. The average age at baseline was 59 years. Patients with stroke histories at baseline were excluded.

The work was funded by the National Institutes of Health. The investigators had no disclosures.
 

SOURCE: Lin MP et al. Neurology. 2018 Apr;90(15 Suppl.):CCI.001.

 


REPORTING FROM AAN 2018

Vitals

 

Key clinical point: The more severe retinopathy is at midlife, the greater the risk of ischemic stroke later on.

Major finding: The incidence of ischemic stroke increased with the severity of baseline retinopathy, from 2.7 strokes per 1,000 participant-years with no retinopathy to 10.2 with a severity score of 3 or higher (P less than .001).

Study details: Review of 10,468 subjects in a population-based cohort study.

Disclosures: The work was funded by the National Institutes of Health. The investigators had no disclosures.

Source: Lin MP et al. Neurology. 2018 Apr;90(15 Suppl.):CCI.001.


Biologics improve axial spondyloarthritis patients’ work performance


LIVERPOOL, ENGLAND – Biologic therapy for axial spondyloarthritis can improve individuals’ work productivity and decrease the extent to which the disease impairs overall work and overall activity, new data from the British Society for Rheumatology Biologics Register in Ankylosing Spondylitis have shown.

A variety of work outcomes on the Work Productivity and Activity Impairment Specific Health Problem (WPAI-SHP) questionnaire improved to a significantly greater extent with biologics use than without. Presenteeism, or working while sick, improved by a mean of –9.4%. Overall work impairment was reduced by 13.9%, and overall activity impairment decreased by 19.2%. There was no great effect on absenteeism, however, with a mean difference in improvement of –1.5% between the groups.

Dr. Joanna Shim (photo: Sara Freeman/MDedge News)
“In today’s society, the importance of work is strongly emphasized, and this is no different for people with axial spondyloarthritis [axSpA],” Joanna Shim, PhD, said at the British Society for Rheumatology annual conference.

“Research into this chronic condition has shown that it has detrimental impact on one’s ability to work,” she added. People may take sick leave and be less productive at work, which can have a psychological effect and cause worry about job loss.

While there is “strong evidence” to show that biologic therapy can improve disease activity in those with axSpA, there is equivocal evidence on whether it has any effect on work outcomes, explained Dr. Shim, a physiotherapist and a postdoctoral research fellow in the Epidemiology Group at the University of Aberdeen (Scotland).

The British Society for Rheumatology Biologics Register in Ankylosing Spondylitis (BSRBR-AS) started recruiting patients with axSpA from 84 centers across the United Kingdom in 2012, and more than 2,500 participants are now included in the register. Similar to other biologics registers run under the auspices of the British Society for Rheumatology, the BSRBR-AS consists of two cohorts of patients: one about to start biologic therapy (with Enbrel [etanercept], Humira [adalimumab], or Cimzia [certolizumab pegol]) and one not receiving biologics.

The current analysis of 577 participants included 161 who had been treated with biologics and 416 who had not. Dr. Shim pointed out that people treated with biologics were younger (42 vs. 47 years), had shorter disease duration (7.7 vs. 12.3 years), and were more likely to be smokers (21% vs. 11%) than were those who had not taken biologics. Biologics-treated patients also had higher mean baseline disease activity measured by the Bath Ankylosing Spondylitis Disease Activity Index (5.8 vs. 3.3), poorer function as measured by the Bath Ankylosing Spondylitis Functional Index (5.4 vs. 2.7), and worse overall Bath Ankylosing Spondylitis Global status scores (6.7 vs. 3.2).

“That’s the reason why they are given biologic therapy in the first place,” Dr. Shim said. To even out these differences, the investigators used propensity score matching before assessing work outcomes with the WPAI-SHP questionnaire. The questionnaire consists of four components, each assessed over the previous 7 days: absenteeism, presenteeism, overall work impairment (a combination of absenteeism and presenteeism), and overall activity impairment.
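For context, the standard WPAI scoring converts reported hours and 0-10 ratings into percentage impairments; the sketch below follows the generic published algorithm with assumed variable names and is not the registry's own scoring code.

# Generic WPAI scoring arithmetic, expressed as percentages.
# Variable names and the worked example are assumptions for illustration.
def wpai_scores(hours_missed_health, hours_worked, presenteeism_0_10, activity_0_10):
    total = hours_missed_health + hours_worked
    absenteeism = hours_missed_health / total if total else 0.0
    presenteeism = presenteeism_0_10 / 10.0
    overall_work = absenteeism + (1 - absenteeism) * presenteeism
    return {
        "absenteeism": round(absenteeism * 100, 1),
        "presenteeism": round(presenteeism * 100, 1),
        "overall_work_impairment": round(overall_work * 100, 1),
        "overall_activity_impairment": round(activity_0_10 * 10.0, 1),
    }

# Example: 4 of 40 work hours missed, productivity rated 4/10, activity 6/10.
print(wpai_scores(4, 36, 4, 6))
# {'absenteeism': 10.0, 'presenteeism': 40.0,
#  'overall_work_impairment': 46.0, 'overall_activity_impairment': 60.0}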

At recruitment, the investigators found that patients who later received biologics had greater impairment in work outcomes than did patients who later did not receive biologics. Patients who went on to receive biologics reported more absenteeism (13.0% vs. 3.0%), presenteeism (41.5% vs. 19.9%), overall work impairment (43.3% vs. 20.6%), and overall activity impairment (59.9% vs. 32.5%).

“Despite the improvements that we observed, there is still substantial unmet need, in the sense that people in the biologic therapy group are still experiencing significantly higher work impairments, compared to people in the nonbiologic therapy group,” Dr. Shim said. She added that, ideally, there should be no work impairment at all.

Dr. Shim and her associates combined the new BSRBR-AS findings with data from four prior randomized, controlled studies that met criteria for a meta-analysis. The results showed a mean difference between biologic and nonbiologic treatment of –5.35 on presenteeism, –11.20 on overall work impairment, and –12.13 on overall activity impairment. Again, there was little overall effect on absenteeism, with a mean difference of 0.84 between the groups.
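The presentation did not specify the pooling model, but a conventional way to combine mean differences across studies is inverse-variance weighting; the sketch below is purely illustrative, with made-up study-level effects and standard errors.

# Fixed-effect, inverse-variance pooling of mean differences (illustrative only;
# the effect sizes and standard errors below are invented, not the studies' data).
def pooled_mean_difference(mean_diffs, std_errors):
    weights = [1 / se ** 2 for se in std_errors]
    pooled = sum(w * md for w, md in zip(weights, mean_diffs)) / sum(weights)
    pooled_se = (1 / sum(weights)) ** 0.5
    return pooled, pooled_se

mds = [-9.4, -6.1, -3.8, -5.0, -4.2]   # hypothetical per-study mean differences
ses = [2.1, 1.8, 2.5, 1.6, 2.0]        # hypothetical standard errors
pooled, se = pooled_mean_difference(mds, ses)
print(f"pooled mean difference = {pooled:.2f} (SE {se:.2f})")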

The apparent lack of effect of biologic treatment on absenteeism could have several explanations, one being that absenteeism was reportedly low in the BSRBR-AS and in other studies. Also, there is some evidence that presenteeism precedes absenteeism. The type of work done or the number of allowed sick days may also play a role, Dr. Shim suggested.

“Work is a very important economic and social outcome,” Dr. Shim said. “We propose that future work should look into the assessment of work outcomes as standard measures,” in order to generate a greater evidence base around pharmacologic and nonpharmacologic approaches to improve work outcomes.

The BSRBR-AS is funded by the British Society for Rheumatology, which in turn has received funding from AbbVie, Pfizer, and UCB. Dr. Shim reported that she had no conflicts of interest in relation to her presentation.

 

 

SOURCE: Shim J et al. Rheumatology. 2018;57[Suppl. 3]:key075.181.




REPORTING FROM RHEUMATOLOGY 2018

Vitals

 

Key clinical point: Treatment with biologic therapy led to greater improvement in work outcomes over time than was seen in patients who did not take biologics.

Major finding: Presenteeism improved by a mean of –9.4%, overall work impairment was reduced by 13.9%, and overall activity impairment decreased by 19.2%.

Study details: 577 patients registered in BSRBR-AS (the British Society for Rheumatology Biologics Register in Ankylosing Spondylitis).

Disclosures: The BSRBR-AS is funded by the British Society for Rheumatology, which in turn has received funding from AbbVie, Pfizer, and UCB. Dr. Shim reported that she has no conflicts of interest in relation to her presentation.

Source: Shim J et al. Rheumatology. 2018;57(Suppl. 3):key075.181.


ADA punts photography ban to presenters


 

Last year’s American Diabetes Association annual meeting gobbled up a lot of social media attention, most of it criticizing the organization’s ban on photography in sessions. This year, it’s the presenters who’ll be in the position of deciding whether to allow photography.

During ADA 2017, attendees tweeting photos of presentation slides were surprised to get tweets from the association saying, “Thanks for joining us! Photography isn’t allowed at #2017ADA presentations; we’d appreciate if you could delete this tweet.”

Social media responded with fury, news outlets wrote about it, and the uproar came to dominate public perception of the meeting. While the ADA claimed the policy was to protect intellectual property of unpublished data, critics countered that instant information is the norm and that medical innovation depends on it.

This year, it will be up to the presenters to decide whether their slides can be photographed and shared. The organizers state: “Each presenter/study author will announce, verbally and visually on a slide at the beginning of their presentation, whether or not he/she approves of photos being taken of their slides.”

So watch for the slides, and share your #2018ADA experiences and reactions with us at @ClinEndoNews.


Will a cocaine epidemic follow the opioid crisis?


 

Ongoing efforts to understand the impact of cocaine on the brain and on behavior have gained considerable momentum over the past few decades. It was only 30 years ago that cocaine was widely considered both safe and nonaddicting – “the champagne” of drugs.1 Progress has been steady since the cocaine-dopamine depletion theory was proposed and ultimately supported by functional and PET imaging.2,3

Additional discoveries promise further insights into the neuroscience of addiction, pleasure, and mood. While cocaine use, abuse, and dependence might seem relatively quiescent, compared with the scourge of opiate-related deaths and addiction, it remains a public health concern – and now is the second-leading cause of drug deaths. Cocaine cultivation, smuggling, use, and the number of first-time users are all escalating.4

These developments suggest that cocaine problems might get much worse and raise an important question: Will new research give us insight into better solutions?

What the neuroscience shows

As discussed, we already know quite a bit about the neuroscience behind cocaine addiction. The positron emission tomography studies conducted by Nora D. Volkow, MD, and her associates have shown long-lasting changes in abstinent cocaine addicts. Specifically, their findings clearly demonstrated that cocaine changes the brain and depletes dopamine-rich areas. Furthermore, dopamine recovery is negligible after months of abstinence.5

However, large gaps in our understanding remain. The realm of epigenetic study and protein expression behind abuse will be key to bridging our understanding of phenotype and genotype. A recent article by Eric J. Nestler, MD, PhD, and his research team, published in Biological Psychiatry and titled “Cocaine self-administration alters transcriptome-wide responses in the brain’s reward circuitry,” offers exciting new insights (Biol Psychiatry. 2018 Apr. doi: 10.1016/j.biopsych.2018.04.009).

The study, led by Deena M. Walker, PhD, offers perhaps the most complete illumination to date of the genetic and epigenetic changes seen in the brain after cocaine self-administration and application.

Dr. Walker and her associates used a mouse model, sorting the animals into one of several groups. One group received acute, self-administered cocaine exposure only, with the mice harvested immediately thereafter. Two longer-term groups included one that had cocaine exposure with prolonged (30-day) withdrawal followed by context re-exposure (defined as being placed back in the chamber and lighting conditions in which they first received cocaine) and another that had cocaine exposure with prolonged withdrawal followed by both context and cocaine re-exposure.

The researchers also ran a parallel set of control groups substituting saline for cocaine with otherwise identical durations of observation and context re-exposure. Reward-related brain regions were harvested from each subject and examined with RNA-sequencing analysis to investigate the full genomic/transcriptomic profile of each. Pairwise comparison of the various experimental groups against the control groups (for example, the theoretical baseline of genetic expression in a non–cocaine-exposed brain) uncovered telling patterns, which the investigators aptly described as a comprehensive picture of the transcriptome-wide changes cocaine causes within the reward circuit.
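The authors used a dedicated RNA-sequencing pipeline; the toy Python sketch below is not their method, and real count data would call for a negative-binomial framework such as DESeq2 or edgeR. It is included only to illustrate the shape of a pairwise cocaine-versus-saline comparison followed by multiple-testing correction.

# Toy pairwise comparison of simulated expression values for one brain region.
# Not the authors' pipeline; real RNA-seq counts need count-based models.
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
n_genes, n_per_group = 200, 6
cocaine = rng.normal(loc=5.0, scale=1.0, size=(n_genes, n_per_group))
saline = rng.normal(loc=5.0, scale=1.0, size=(n_genes, n_per_group))
cocaine[:10] += 1.5  # pretend the first 10 genes are truly upregulated

pvals = stats.ttest_ind(cocaine, saline, axis=1).pvalue
reject, _, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
print(f"{reject.sum()} genes pass FDR < 0.05 in this simulated region")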

The novel and creative approach used by Dr. Walker and her associates allowed them to uncover a wealth of clinically significant findings. Much could be said about their spotlighting of specific gene/protein targets for potential future pharmacological therapies toward cocaine treatment – an area that is indeed in sore need of invention. While we have highly efficacious medications for overdose and chronic treatment of opiate abuse, the landscape of treatment options for cocaine is far bleaker and shrouded in theory. With that in mind, perhaps the most salient take-home point is the evidence that cocaine, even after one exposure/withdrawal event, causes a dramatic rewiring in the very way genes are expressed across the reward circuit. The researchers found large shifts in the patterns of genetic transcription, unique and specific to discrete regions of the examined brain tissue, such as the ventral tegmental area, ventral hippocampus, and basolateral amygdala.

More interestingly, similar patterns of these genetic alterations were observed based on the exact history of the cocaine exposure. Dr. Walker and her associates concluded that the withdrawal phase and context re-exposure appear to be crucial components in the re-sculpting of the transcriptomic profile of the reward circuitry.

The brain is unprepared by evolution for the reinforcing and reorganizing effects of cocaine. Clinicians, too, have learned that cocaine is addicting and can quickly replace drives such as food, water, sex, and survival. These new data from Dr. Nestler’s team reinforce the importance of prevention. In addition, they are reminders to physicians that cocaine causes changes in brain and behavior that are persistent and not necessarily reversible. The patterns of transcriptomic change are alarming enough and have only recently begun to be fleshed out, but global substance use trends suggest that we need to begin cocaine prevention activities.
 

 

 

Is another cocaine epidemic inevitable?

The late David F. Musto, MD, who was revered as both expert medical historian and physician at Yale University, New Haven, Conn., offered perhaps the most poignant observation in this regard: He argued that almost every opiate epidemic seems to transition into a psychostimulant epidemic.6 Experts have been looking at cocaine and methamphetamine as a way to try to understand the current opioid epidemic. Indeed, the Centers for Disease Control and Prevention’s most recent report on emerging trends in cocaine use shows numerous, concerning upticks in several realms germane to a possible emerging epidemic. One of the more upstream concerns is a gigantic spike in the sheer production of coca leaves and cocaine thought to be occurring in Colombia (the principal source of cocaine in the United States). Current U.S. government estimates based on seizure rates from 2016 indicate that Colombia is producing about 910 metric tons of export-quality cocaine. That represents a large increase from the 670-ton estimate the year before and the 325-ton estimate the year before that.

Similarly, a sharp rise in cocaine-related deaths, an approximate 52% increase, has been charted from 2015 to 2016. This finding is likely related to the growing presence of adulterants, such as fentanyl and carfentanil, found in seized cocaine samples. However, a rise in first-time cocaine users in the past year, which, according to the National Survey on Drug Use and Health, is up by about 12% (1.1 million people) in the 2015-2016 period, shows that the danger of cocaine-related deaths might not lie solely in adulteration but also in increased use. These signals might herald a grim return of cocaine to the center stage of public health, a development that would be an encore of the crack cocaine epidemic experienced throughout the 1980s and early 1990s.

Dr. Michael L. Wenzinger

All the above findings support cocaine as an agent of swift and massive change to our reward systems, one that might be poised to again surge across the United States at epidemic levels. Given this insight into just how extensively it rewires brains and the unfortunate truth that direct pharmacotherapy treatments remain mostly theoretical, it is evident that the best course of action is simply to keep cocaine from ever reaching the brain in the first place. Prevention does work, and these findings underline the importance of that message. Direct psychoeducation, awareness programs, and deterrence are the best defense we can offer to our patients at this time. In addition to these tried-and-true techniques, fascinating new models of prevention for cocaine abuse also are in development: vaccines. Synthesized by binding cocaine to inert proteins, these vaccines are designed to prevent addiction by training the immune system to bind cocaine and thus prevent it from crossing the blood-brain barrier.7 Currently approved for clinical study in humans, these might offer a game-changing new method in the prevention of substance abuse.

 

Dr. Mark S. Gold


In summary, continued research has enriched us with a deeper appreciation of just how profoundly cocaine, even after a single exposure, rewires the brain. Some people might have a cavalier attitude about drugs and even use terms such as experimentation to describe teen use, but cocaine is not cannabis. Not only initial cocaine self-administration, but also withdrawal and context of use (a bathroom, a bar table, a countertop) all serve to debase the natural transcriptome balance of the brain’s reward system. Our knowledge of what exactly contributes to the path of cocaine addiction has grown, but options for how to treat cocaine overdose and addiction remain slim. This is particularly concerning, as history and data indicate a likelihood that a cocaine epidemic might come on the heels of the opiate epidemic. Now more than ever we need to emphasize the importance of preventing cocaine use – and continue to develop new interventions.
 

Dr. Wenzinger is a clinical fellow, PGY-4, in the department of child and adolescent psychiatry at St. Louis Children’s Hospital. Dr. Gold is the 17th Distinguished Alumni Professor at the University of Florida, Gainesville, and professor of psychiatry (adjunct) at Washington University in St. Louis. He also serves as chairman of the scientific advisory boards for RiverMend Health.

 

 

References

1. Yale J Biol Med. 1988 Mar-Apr;61(2):149-55.

2. Neurosci Biobehav Rev. 1985 Fall;9(3):469-77.

3. Am J Psychiatry. 1990;147(6):719-24.

4. DEA Museum. “A New Look at Old and Not So Old Drugs: A 2018 Update on Cocaine.” Drug Enforcement Administration. Retrieved from https://deamuseum.org/lecture-series/new-look-old-not-old-drugs-2018-update-cocaine.

5. J Addict Dis. 1996;15(4):55-71.

6. The American Disease: Origins of Narcotic Control (New York: Oxford University Press, 1999).

7. Br J Clin Pharmacol. 2014 Feb;77(2):368-74.


Best Practices: Life-Threatening Allergy Prevention


This supplement is sponsored by kaléo.

Click here to read the article

This supplement to Pediatric News discusses life-threatening allergy prevention, including the importance of early introduction of peanut foods to certain infants.
 
About the Author
Vivian Hernandez-Trujillo, MD
Allergy and Immunology Care Center of South Florida

 

Click here to read the article


Is CLL chemoimmunotherapy dead? Not yet


Chemoimmunotherapy for chronic lymphocytic leukemia is on the way out, but there’s one scenario where it still plays a key role, according to one leukemia expert.

That scenario is not in relapsed or refractory chronic lymphocytic leukemia (CLL), where the use of fludarabine, cyclophosphamide, and rituximab (FCR) may be hard to justify today. Data supporting use of FCR in relapsed CLL show a median progression-free survival (PFS) of about 21 months, Susan M. O’Brien, MD, of the University of California, Irvine, said at the annual meeting of the American Society of Clinical Oncology. There are also data for bendamustine-rituximab retreatment showing a median event-free survival of about 15 months, she added.

By contrast, 5-year follow-up data for the Bruton tyrosine kinase inhibitor ibrutinib in the relapsed/refractory setting show a median PFS of 52 months, which is “extraordinary,” given that the patients had a median of four prior regimens, Dr. O’Brien said.

Similarly, recently published results from the randomized, phase 3 MURANO study of venetoclax plus rituximab in relapsed/refractory CLL showed that median PFS was not reached at a median follow-up of 23.8 months, versus a median of 17 months for the bendamustine-rituximab comparison arm (N Engl J Med. 2018;378[12]:1107-20).

“Thanks to the MURANO study, we likely will have an expanded label for venetoclax that includes the combination of venetoclax and rituximab,” Dr. O’Brien said. “I think it’s quite clear that either of these is dramatically better than what you get with retreatment with chemotherapy, so I personally don’t think there is a role for chemoimmunotherapy in the relapsed patient.”

On June 8, 2018, the Food and Drug Administration granted regular approval for venetoclax for patients with CLL or small lymphocytic lymphoma, with or without 17p deletion, who have received at least one prior therapy. The FDA also approved its use in combination with rituximab.*

But frontline CLL treatment is currently a little bit more complicated, Dr. O’Brien said.

Recent studies show favorable long-term outcomes with FCR frontline therapy in the immunoglobulin heavy chain variable gene (IgHV)-mutated subgroup of patients, she noted.

The longest follow-up comes from a study from investigators at the University of Texas MD Anderson Cancer Center, Houston, published in 2016. In that study, the 12.8-year PFS was 53.9% for IgHV-mutated patients, versus just 8.7% for patients with unmutated IgHV. Of the IgHV-mutated group, more than half achieved minimal residual disease (MRD) negativity after treatment (Blood. 2016 Jan 21;127[3]:303-9).

“I’m going to go out on a limb and I’m going to suggest that I think there is a cure fraction here,” Dr. O’Brien said. “On the other hand, if there’s not a cure fraction and they’re going to relapse after 17 years, that’s a pretty attractive endpoint, even if it’s not a cure fraction.”

Clinical practice guidelines now recognize IgHV mutation status as an important marker that should be obtained when deciding on treatment, Dr. O’Brien noted.

For unmutated patients, the RESONATE-2 trial showed that ibrutinib was superior to chlorambucil in older patients, many of whom had comorbid conditions. In the 3-year update, median PFS was approximately 15 months for chlorambucil, while for ibrutinib the median PFS was “nowhere near” being reached, Dr. O’Brien said.

Those data may not be so relevant for fit, unmutated patients, and two randomized trials comparing FCR with bendamustine and rituximab have yet to report data. However, one recent cross-trial comparison found fairly overlapping survival curves for the two chemoimmunotherapy approaches.

Dr. O’Brien said she would put older patients with comorbidities on ibrutinib if a clinical trial were not available. For fit, unmutated patients, she would also use ibrutinib, although more data are needed. However, patient preference sometimes tips the scale in favor of FCR.

“The discussions sometimes are quite long about whether the patient should opt to take ibrutinib or FCR,” Dr. O’Brien said. “The last patient I had that discussion with elected to take FCR. When I asked him why, he said because he liked the idea of being finished in six cycles, off all therapy, and hopefully in remission.”

While Dr. O’Brien said she still views chemoimmunotherapy as relevant in IgHV-mutated patients, she concluded that it, too, will eventually go away. Toward that end, there is considerable interest in venetoclax plus ibrutinib, a combination that, in early reports, has yielded very encouraging MRD results in first-line CLL.

“We have no long-term data, but very, very exciting MRD negativity data,” Dr. O’Brien said.

Dr. O’Brien reported relationships with AbbVie, Amgen, Celgene, Gilead Sciences, Janssen, Pfizer, Pharmacyclics, Sunesis Pharmaceuticals, and others.

*This story was updated 6/25/2018.


EXPERT ANALYSIS FROM ASCO 2018
