Antibiotic Failure: Not Only a Hospital Phenomenon
The public tends to think of antibiotic resistance as a problem that largely affects patients in hospitals, say researchers from Cardiff University, University of Oxford, and Pharmatelligence, all in the United Kingdom (UK); and Abbott Healthcare Products in the Netherlands. Unfortunately, they note, so do many primary care practitioners, even though recent antibiotic use in primary care is the single most important risk factor for an infection with a resistant organism.
As the researchers’ study of 22 years of primary care prescribing in the UK makes clear, antibiotic resistance is a primary care problem, too. During that time, > 1 in 10 of the initial antibiotic monotherapies they studied failed.
Using data on 58 million antibiotic prescriptions from the Clinical Practice Research Datalink, a database derived from nearly 700 primary care practices in the UK, the researchers analyzed almost 11 million first-time monotherapy episodes for 4 indications: upper respiratory tract infections (URTIs), lower respiratory tract infections, skin and soft tissue infections, and acute otitis media. Of all antibiotic prescriptions, 98% were monotherapy.
Over time, the proportion of infections treated with antibiotics changed. The greatest increase was in the smallest class, acute otitis media, which rose from 63% in 1991 to 83% in 2012. The proportion of URTIs treated with antibiotics dropped from 59% in 1991 to 55% in 2012.
The most commonly prescribed antibiotic was amoxicillin (42% of infections), followed by phenoxymethylpenicillin (penicillin V) (95% for URTIs) and flucloxacillin (97% for skin and soft tissue infections).
The treatment failure rate rose from 13.9% in 1991 to 15.4% in 2012, with some “notably high levels of failure,” the researchers say. They cite trimethoprim’s overall failure rate of 37% (increasing from 24.7% in 1991 to 55.9% in 2012) when used to treat URTIs. Failure rates for cephalosporins also increased “markedly.” By contrast, failure rates for macrolides across the 4 infection classes remained largely stable. In 2012, the antibiotics with the lowest failure rates were penicillin-V for URTIs, and lymecycline and oxytetracycline for skin and soft tissue infections.
The rise in antibiotic failures was less prominent for the most frequently prescribed antibiotics and those recommended for first-line treatment, such as amoxicillin, clarithromycin, and erythromycin. The more striking increases were seen in antibiotics not usually recommended as first-line treatments for the infection classes in the study, such as cephalosporins. Those drugs, however, might have been prescribed for more severely ill and frail patients who had recently been prescribed a first-line drug or who were already resistant to a drug, the researchers say.
Most of the increase in failures dated from 2000, the researchers say, when community antibiotic prescribing, which had been falling in the late 1990s, plateaued, and then once again began rising.
Their findings could represent a phenomenon that will resolve, or might be an “early indication of a more dramatic and worrying process,” the researchers caution. The finding that 1 in 10 initial antibiotic treatments in primary care fails represents a “considerable burden” on patients and the health care system. They suggest that primary care physicians can play a central role in helping to contain rises in antibiotic treatment failures by managing patient expectations and carefully considering whether each prescription is justified.
Source
Currie CJ, Berni E, Jenkins-Jones S, et al. BMJ. 2014;349:g5493.
doi: 10.1136/bmj.g5493.
Reducing Candida-Related Shock With Empiric Treatment
The microbial cause of infection is often not known at the time antibiotics are prescribed for patients in Candida-related septic shock, but delaying therapy has been associated with a mortality rate of > 90%. Researchers from St. Louis College of Pharmacy, Barnes-Jewish Hospital, BJC HealthCare, and Washington University, all in St. Louis, Missouri, conducted a pilot study that found empiric antifungal treatment could shorten the time to administration of appropriate treatment for Candida-related septic shock.
The Barnes-Jewish Hospital intensive care unit (ICU) averages 1,400 admissions per year, the researchers say, with a 10% prevalence of Candida as the cause of septic shock. They add that the rate of resistance to fluconazole in all species of Candida combined is about 15%. In this before-after study, 15 patients who presented before December 31, 2012, were in the standard-care group. They received antibiotics, including antifungal drugs, at the discretion of the treating physician. The remaining 13 (treated after January 1, 2013) received empiric therapy with micafungin 100 mg/d or fluconazole 800 mg IV on day 1, followed by 400 mg/d IV. The choice of antifungal agent was left to the ICU team and clinical pharmacist but was partly based on whether the patient had any prior exposure to fluconazole, in which case micafungin was prescribed.
Sixteen patients received appropriate antifungal therapy. The remaining 12 patients received delayed antifungal therapy; 1 of them received no antifungal therapy before death.
The mean time from onset of shock to appropriate therapy was significantly shorter in the empiric therapy group (10.6 hours vs 40.5 hours). The mean time from culture collection to appropriate therapy was also significantly shorter in the empiric therapy group (13.7 hours vs 43.3 hours in the standard-care group; P = .001). Patients who received empiric therapy were more likely to have received appropriate therapy within 12 hours (69.2% vs 6.7%) and within 24 hours (76.9% vs 40%).
The shorter time to appropriate treatment meant a slight but noticeable difference in survival. Twelve patients died during hospitalization, but those who received appropriate therapy within 24 hours of onset of hypotension had greater hospital survival rates: 68.8% vs 41.7%.
Source
Micek ST, Arnold H, Juang P, et al. Clin Ther. 2014;36(9):1226-1232.
doi: 10.1016/j.clinthera.2014.06.28.
Long-Acting Insulin Analogs: Effects on Diabetic Retinopathy
Long-acting insulin analogs are designed to enhance glycemic control without excessively lowering blood glucose. But structural modifications of the insulin molecule can alter biological responses and binding characteristics with specific receptors; in short, they can potentially raise the risk of sight-threatening diabetic retinopathy (STDR), say researchers from Taipei City Hospital and National Taiwan University, both in Taiwan.
The researchers note that some clinical trials have reported that intensification of endogenous insulin might accelerate progression of pre-existing STDR. However, they add that some studies used cancer cell lines, and insulin was administered at supraphysiologic concentrations.
The researchers conducted a retrospective study to compare the effects of long-acting insulin analogs (glargine and/or detemir) with those of neutral protamine Hagedorn (NPH) insulin on the progression of STDR in 46,739 patients with type 2 diabetes mellitus (T2DM).
They found no change in the risk of STDR with the long-acting insulin analogs in either matched or unmatched cohorts. For instance, glargine initiators had 479 events among 8,947 patients over a median follow-up of 483 days, compared with 541 events among 8,947 patients in the NPH initiators group over a median follow-up of 541 days. The detemir group, with 411 days of follow-up, had 64 events.
Despite a “relatively short” observation period, the researchers say their findings agree with those of a previous open-label randomized study of patients with T2DM, which found treatment with insulin glargine over 5 years did not increase progression of STDR, compared with NPH insulin treatment.
Source
Lin JC, Shau WY, Lai MS. Clin Ther. 2014;36(9):1255-1268.
doi: 10.1016/j.clinthera.2014.06.031.
Referring Smokers to Quitlines
Telephone-based programs to encourage and support quitting smoking—known as “quitlines”—have been established as effective both clinically and in the “real world.” However, quitlines rely on smokers calling in for them to work. Researchers from the Tobacco Cessation and Prevention Program and the Massachusetts Department of Public Health, both in Boston, say being referred to quitlines by their health care providers might help even more smokers quit.
The researchers collected data for 2,737 provider-referred and 530 self-referred smokers from the Massachusetts Smokers’ Helpline, which offers evidence-based proactive telephone counseling sessions, nicotine replacement therapy (NRT), and self-help materials. They then examined differences in demographics, service utilization, and quit outcomes. The primary outcome was quit status at the 6- to 8-month follow-up.
Twenty percent of provider-referred clients quit smoking; 26% of self-referred clients quit. Provider-referred smokers who used the quitline services had higher odds of quitting, compared with those who used methods not including a quitline. However, the provider-referral model was limited by lower enrollment, lower use, and poorer outcomes compared with self-referred services. Patients aged 18 to 34 years and those aged > 65 years had the best chances of quitting and staying smoke-free. Patients referred from private practices were more likely to quit than were those referred from a hospital.
Provider-referred smokers who read the self-help materials had 1.2 times the odds of quitting compared with those who didn’t. Using any amount of the 2-week supply of NRT doubled the odds of quitting. Clients who used a combination of counseling and NRT had the greatest success, with more than triple the chances of quitting, compared with clients who did neither.
Self-referred smokers also had good results, although not usually as quickly as provider-referred smokers, who saw significantly improved outcomes after 3 counseling sessions. It took 4 sessions for the self-referred smokers to see the same results.
Readiness to quit was an important predictor of success, as was degree of tobacco dependence. Clients in both groups who could wait > 30 minutes before their first cigarette of the day fared better than those who had to smoke immediately after waking.
The researchers suggest some ways to improve the provider-referred model. One is to offer more support for smokers from lower socioeconomic levels. According to other research, those smokers may need more treatment content and access to more intensive pharmacotherapy, the researchers say.
It might also help all quitline clients, they add, to minimize the wait time between referral and actual provision of service. Longer wait times can “increase the room for ambivalence for any smoker,” they note, “but may be especially detrimental for provider-referred smokers who are less ready to make a quit attempt.”
And what is the provider’s role in assessing patient readiness to quit and preparing patients for the quitline services? In reality, the researchers say, not all patients have received an evidence-based intervention or are ready to quit when they are enrolled. Moreover, many are lost in the callback process. With 3 callback attempts, the Massachusetts quitline reaches only 40% of its potential quitters; increasing to 5 attempts raises the reach slightly, to 50%. That shows the need, the researchers say, for better and more frequent provider training, outreach, feedback reporting, on-site clinical champions, and systems support.
Source
Song G, Landau AS, Gorin TJ, Keithly L. Am J Prev Med. 2014;47(4):392-402.
doi: 10.1016/j.amepre.2014.05.043.
Telephone-based programs to encourage and support quitting smoking—known as “quitlines”—have been established as effective both clinically and in the “real world.” However, quitlines rely on smokers calling in for them to work. Researchers from the Tobacco Cessation and Prevention Program and the Massachusetts Department of Public Health, both in Boston, say being referred to quitlines by their health care providers might help even more smokers quit.
The researchers collected data for 2,737 provider-referred and 530 self-referred smokers from the Massachusetts Smokers’ Helpline, which offers evidence-based proactive telephone counseling sessions, nicotine replacement therapy (NRT), and self-help materials. They then examined differences in demographics, service utilization, and quit outcomes. The primary outcome was quit status at the 6- to 8-month follow-up.
Twenty percent of provider-referred clients quit smoking; 26% of self-referred clients quit. Provider-referred smokers who used the quitline services had higher odds of quitting, compared with those who used methods not including a quitline. However, the provider-referral model was limited by lower enrollment, lower use, and poorer outcomes compared with self-referred services. Patients aged 18 to 34 years and those aged > 65 years had the best chances of quitting and staying smoke-free. Patients referred from private practices were more likely to quit than were those referred from a hospital.
Provider-referred smokers who read the self-help materials had 1.2 times the odds of quitting compared with those who didn’t. Using any amount of the 2-week supply of NRT doubled the odds of quitting. Clients who used a combination of counseling and NRT had the greatest success, with more than triple the chances of quitting, compared with clients who did neither.
Self-referred smokers also had good results, although not usually as quickly as provider-referred smokers, who saw significantly improved outcomes after 3 counseling sessions. It took 4 sessions for the self-referred smokers to see the same results.
Readiness to quit was an important predictor of success. Tobacco dependence also predicted quitting success. Clients in both groups who could wait > 30 minutes before smoking their first cigarette had better luck than did those who had to smoke immediately after waking.
The researchers suggest some ways to improve the provider-referred model. One is to offer more support for smokers from lower socioeconomic levels. According to other research, those smokers may need more treatment content and access to more intensive pharmacotherapy, the researchers say.
It might also help all quitline clients, they add, to minimize the wait time between referral and actual provision of service. Longer wait times can “increase the room for ambivalence for any smoker,” they note, “but may be especially detrimental for provider-referred smokers who are less ready to make a quit attempt.”
And what is the provider’s role in assessing patient readiness to quit and preparing patients for the quitline services? In reality, the researchers say, not all patients have received an evidence-based intervention or are ready to quit when they are enrolled. Moreover, many are lost in the callback process. Using 3 callback attempts, the Massachusetts quitline only reaches 40% of their potential quitters. Upping the callback rate to 5 attempts raises the reach slightly, to 50%. That shows the need, the researchers say, for better and more frequent provider training, outreach, feedback reporting, clinical champions on site, and systems support.
Source
Song G, Landau AS, Gorin TJ, Keithly L. Am J Prev Med. 2014;47(4):392-402.
doi: 10.1016/j.amepre.2014.05.043.
Telephone-based programs to encourage and support quitting smoking—known as “quitlines”—have been established as effective both clinically and in the “real world.” However, quitlines rely on smokers calling in for them to work. Researchers from the Tobacco Cessation and Prevention Program and the Massachusetts Department of Public Health, both in Boston, say being referred to quitlines by their health care providers might help even more smokers quit.
The researchers collected data for 2,737 provider-referred and 530 self-referred smokers from the Massachusetts Smokers’ Helpline, which offers evidence-based proactive telephone counseling sessions, nicotine replacement therapy (NRT), and self-help materials. They then examined differences in demographics, service utilization, and quit outcomes. The primary outcome was quit status at the 6- to 8-month follow-up.
Twenty percent of provider-referred clients quit smoking; 26% of self-referred clients quit. Provider-referred smokers who used the quitline services had higher odds of quitting, compared with those who used methods not including a quitline. However, the provider-referral model was limited by lower enrollment, lower use, and poorer outcomes compared with self-referred services. Patients aged 18 to 34 years and those aged > 65 years had the best chances of quitting and staying smoke-free. Patients referred from private practices were more likely to quit than were those referred from a hospital.
Provider-referred smokers who read the self-help materials had 1.2 times the odds of quitting compared with those who didn’t. Using any amount of the 2-week supply of NRT doubled the odds of quitting. Clients who used a combination of counseling and NRT had the greatest success, with more than triple the chances of quitting, compared with clients who did neither.
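Odds ratios like these are easy to misread as probability ratios. As a rough illustration of how the reported odds multipliers translate into quit probabilities, the sketch below assumes a hypothetical 20% baseline quit probability (roughly the provider-referred rate reported above); the multipliers are the study's, but the converted percentages are illustrative only.

```python
def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

def prob(o):
    """Convert odds back to a probability."""
    return o / (1 + o)

baseline = 0.20  # hypothetical baseline quit probability

for label, odds_ratio in [("read self-help materials", 1.2),
                          ("used any NRT", 2.0),
                          ("counseling + NRT", 3.0)]:
    p = prob(odds(baseline) * odds_ratio)
    print(f"{label}: odds ratio {odds_ratio} -> quit probability {p:.0%}")
```

At a low baseline rate, odds ratios and risk ratios are similar, but they diverge as the ratio grows: tripling the odds here raises the quit probability to about 43%, not 60%.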
Self-referred smokers also had good results, although not usually as quickly as provider-referred smokers, who saw significantly improved outcomes after 3 counseling sessions. It took 4 sessions for the self-referred smokers to see the same results.
Readiness to quit was an important predictor of success. Tobacco dependence also predicted quitting success. Clients in both groups who could wait > 30 minutes before smoking their first cigarette had better luck than did those who had to smoke immediately after waking.
The researchers suggest some ways to improve the provider-referred model. One is to offer more support for smokers from lower socioeconomic levels. According to other research, those smokers may need more treatment content and access to more intensive pharmacotherapy, the researchers say.
It might also help all quitline clients, they add, to minimize the wait time between referral and actual provision of service. Longer wait times can “increase the room for ambivalence for any smoker,” they note, “but may be especially detrimental for provider-referred smokers who are less ready to make a quit attempt.”
And what is the provider’s role in assessing patient readiness to quit and preparing patients for the quitline services? In reality, the researchers say, not all patients have received an evidence-based intervention or are ready to quit when they are enrolled. Moreover, many are lost in the callback process. Using 3 callback attempts, the Massachusetts quitline reaches only 40% of its potential quitters. Upping the number of attempts to 5 raises the reach to 50%. That shows the need, the researchers say, for better and more frequent provider training, outreach, feedback reporting, on-site clinical champions, and systems support.
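The diminishing return from extra callbacks can be seen with a toy model. If every attempt independently reached the same fraction of smokers, the 40% reach at 3 attempts would imply a per-attempt reach of about 16%, and 5 attempts would then reach about 57%; the roughly 50% actually observed falls short of that, consistent with a hard-to-reach subgroup. The model and derived numbers below are illustrative, not from the study.

```python
# Toy model: each callback independently reaches a fraction p of smokers,
# so cumulative reach after n attempts is 1 - (1 - p)**n.
# Calibrate p so that 3 attempts give the reported 40% reach.
p = 1 - (1 - 0.40) ** (1 / 3)           # per-attempt reach, ~15.7%
predicted_5 = 1 - (1 - p) ** 5          # model's prediction for 5 attempts
print(f"per-attempt reach: {p:.1%}")
print(f"predicted reach at 5 attempts: {predicted_5:.1%}")  # ~57%, vs ~50% observed
```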
Source
Song G, Landau AS, Gorin TJ, Keithly L. Am J Prev Med. 2014;47(4):392-402.
doi: 10.1016/j.amepre.2014.05.043.
The Best Times to Try Abiraterone
Until recently, there have been few treatment options for advanced prostate cancer that is resistant to androgen-directed therapies. Newer treatments that target residual androgen production offer some hope of prolonging the interval before chemotherapy, with fewer adverse effects (AEs) and better efficacy. One of those is abiraterone, which blocks extragonadal, testicular, and tumor androgen biosynthesis.
An ongoing multinational phase 3 study is evaluating the clinical benefits of abiraterone plus prednisone vs prednisone alone in patients with progressive metastatic castration-resistant prostate cancer (mCRPC). Follow-up for the study has now exceeded 27 months, giving a good opportunity to evaluate safety and efficacy. Thus, after reviewing outcomes so far, the independent data-monitoring committee recommended that the study be unblinded and patients be allowed to cross over from prednisone to abiraterone. The researchers reported the results of the third interim analysis, along with updated safety and efficacy data.
Patients were stratified by Eastern Cooperative Oncology Group performance status (ECOG-PS) and randomly assigned to receive abiraterone 1,000 mg once daily plus prednisone 5 mg twice daily, or placebo plus prednisone.
Patients who received abiraterone had a statistically significant improvement in radiographic progression-free survival (PFS) compared with those on prednisone alone, with a median time to disease progression or death of 16.5 months vs 8.2 months (95% CI for the hazard ratio, 0.45-0.61).
Overall survival was also longer: a median of 35.3 months vs 30.1 months (95% CI for the hazard ratio, 0.66-0.95).
All secondary endpoints also favored abiraterone over prednisone. For instance, abiraterone treatment delayed the time to the need for opiates for cancer-related pain and the time to initiation of chemotherapy. Abiraterone also delayed the time to deterioration in ECOG-PS and prostate-specific antigen (PSA) progression. Abiraterone more than doubled the PSA response rate: 68% vs 29% with prednisone.
Patients reported more pain relief. Those receiving abiraterone had statistically significant improvement in pain interference (P = .005), although the improvement in mean pain intensity was not significant.
Adverse effects leading to dose modifications or interruption of treatment were reported in 21% of patients on abiraterone, compared with 12% of the prednisone group. Six patients (1%) in each group died of drug-related treatment-emergent AEs. The AEs of “special interest,” the researchers say, included events related to mineralocorticoid excess, such as hypertension, hypokalemia, and fluid retention—all unsurprising, given the known mechanism of action of abiraterone. Grade 3 or 4 AEs with increased alanine aminotransferase and aspartate aminotransferase were more common in the abiraterone group.
The most common subsequent therapy for patients who terminated the study was docetaxel. However, another recent study, from Johns Hopkins researchers in Baltimore, Maryland, indicates the transition warrants caution: The findings suggest a potential cross-resistance between docetaxel and abiraterone.
Their study compared outcomes in 24 men who received abiraterone before docetaxel with outcomes in 95 abiraterone-naïve men. Men who had received abiraterone were less likely to achieve a PSA response, and their cancer was more likely to progress.
The researchers concede that their study groups were small; they also say it is possible that differences in disease severity may have influenced the time to progression. However, they say the fact that PSA-PFS was significantly different between the 2 groups (P = .002) supports their initial hypothesis—that is, that abiraterone pretreatment reduces responsiveness to docetaxel.
In spite of its limitations, the researchers say their study represents the only comparative analysis of PSA-PFS and PFS after docetaxel treatment for patients who have or have not received prior abiraterone. Their report, they add, offers the “strongest available evidence to date” of a clinically meaningful cross-resistance between abiraterone and docetaxel. They conclude that their findings provide “valuable information” about which patients are likely to derive the most benefit from docetaxel.
Sources
Rathkopf DE, Smith MR, de Bono JS, et al. Eur Urol. 2014;66(5):815-825.
doi: 10.1016/j.eururo.2014.02.056.
Schweizer MT, Zhou XC, Wang H, et al. Eur Urol. 2014;66(4):646-652.
doi: 10.1016/j.eururo.2014.01.018.
NSAIDs Linked to Poor Pneumonia Outcomes
Nonsteroidal anti-inflammatory drugs (NSAIDs) given in the early stages of lower respiratory tract infection could be helping send younger adults to the intensive care unit (ICU) with serious pneumonia, say researchers from Hôpital Louis Mourier, Colombes, and Université Paris Diderot, both in France. Their concerns were triggered in part by witnessing several cases of unexpectedly severe forms of Streptococcus pneumoniae (S pneumoniae) community-acquired pneumonia (CAP) in healthy adults.
They analyzed data on 106 patients admitted with pneumococcal pneumonia or with S pneumoniae infection and pneumonia as the discharge diagnosis. Twenty patients had received NSAIDs within the 4 days before admission. The patients given NSAIDs were younger than those not prescribed NSAIDs (mean age, 43 vs 62 years), more often employed, and less likely to have comorbidities. The mean duration of NSAID treatment was 4 days. The time to the first medical consultation after pneumonia-related symptoms appeared was the same in both groups, but the patients on NSAIDs were prescribed antibiotics significantly later than those not taking NSAIDs (4.5 days vs 2 days, P = .001). They were also admitted to the ICU later.
A “noticeable and significant difference” was that more patients in the NSAID group had pleural effusion (P < .0006). New onset of pleuropulmonary complications during the ICU stay was significantly greater in the group who had received NSAIDs than in the no-NSAID group (P = .0008).
The researchers say their findings “highlight the overlooked risk of taking NSAIDs to treat physical symptoms at an early stage of CAP.” They hypothesize that patient age and comorbidity status led physicians to overlook CAP and thus withhold antibiotics. NSAIDs may blunt general signs and symptoms and mask the severity of the infectious process, the researchers caution. They therefore recommend ensuring appropriate antibiotic coverage whenever NSAIDs are used in this setting.
In a survey of French general practitioners’ prescriptions, NSAIDs were prescribed for almost half of all patients seen for lower respiratory tract infection, “despite the fact that this prescription never appears in any national or international guideline,” the researchers say. That underscores the need to better inform general practitioners about the risks of NSAIDs, they say.
Source
Messika J, Sztrymf B, Bertrand F, et al. J Crit Care. 2014;29(5):733-738.
doi: 10.1016/j.jcrc.2014.05.021.
Do Benzodiazepines Increase Dementia Risk?
Benzodiazepines are regularly used to treat anxiety, insomnia, and depression. Guidelines advise only short-term benzodiazepine use for elderly patients, but long-term treatment is still common. According to researchers from the University of Montreal in Canada and the University of Bordeaux in France, long-term dosing can do more harm than good for patients at risk for Alzheimer disease. Previous research has established that long-term benzodiazepine use can have deleterious effects on memory and cognition, say the researchers.
Earlier studies could not prove a connection between the drugs and dementia because they lacked sufficient power, had follow-up periods that were too short, or had other methodologic limitations. To counter those limitations, the researchers designed their study to assess benzodiazepine treatments initiated > 5 years before the diagnosis of Alzheimer disease or dementia, when prescriptions were less likely to be motivated by prodromes.
The researchers used an administrative claims database with a long follow-up period to look at the potential dose-effect relationship. They defined exposure by 3 criteria: “ever use” (≥ 1 claim for a benzodiazepine from 5 to 10 years before the index date); cumulative dose (≤ 3 months, 3 to 6 months, or > 6 months [long-term use]); and drug elimination half-life (short- [< 20 hours] or long-acting benzodiazepines). The researchers matched 1,796 patients with 7,184 controls and followed them for ≥ 6 years before the index date.
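As a sketch of how those exposure definitions might be operationalized against claims data, the function below classifies a patient's benzodiazepine exposure using the study's 3 criteria. The day-count thresholds (91 and 182 days for the 3- and 6-month cut points) and the function itself are assumptions for illustration, not the authors' code.

```python
def classify_exposure(claims_days, half_life_hours):
    """Classify benzodiazepine exposure per the study's 3 criteria.

    claims_days: total days of benzodiazepine supplied 5-10 years
                 before the index date (0 means never used).
    half_life_hours: elimination half-life of the drug used.
    """
    if claims_days == 0:
        return "never used"
    # Cumulative-dose categories (91/182 days approximate 3/6 months)
    if claims_days <= 91:
        dose = "<= 3 months"
    elif claims_days <= 182:
        dose = "3-6 months"
    else:
        dose = "> 6 months (long-term use)"
    acting = "short-acting" if half_life_hours < 20 else "long-acting"
    return f"ever use; {dose}; {acting}"

print(classify_exposure(0, 12))    # no claims in the window
print(classify_exposure(30, 12))   # brief use of a short-acting agent
print(classify_exposure(240, 36))  # long-term use of a long-acting agent
```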
The risk of Alzheimer disease, the study revealed, increased by 43% to 51% among people who had used benzodiazepines in the past: 894 (49.8%) people with Alzheimer disease had used benzodiazepines at some point, compared with 2,873 controls (40%). Short-term use did not differ between the 2 groups. Long-term use was more common among people with Alzheimer disease, the researchers found: 32.9% of those with Alzheimer disease compared with 21.8% of those in the control group.
Risk of Alzheimer disease increased when long-acting benzodiazepines were used. Because there is no prevention or cure for Alzheimer disease, the researchers urge focusing on duration of benzodiazepine use and other modifiable risk factors.
Source
Billioti de Gage S, Moride Y, Ducruet T, et al. BMJ. 2014;349:g5205.
doi: 10.1136/bmj.g5205.
Probiotics for Radiation-Caused Diarrhea
Probiotics may help reduce the severity of one of the most common acute adverse effects of radiation—diarrhea. They just might not show results right away. According to a study by researchers from the Centre Hospitalier Universitaire de Québec in Canada, the effects of probiotics were positive and most effective in the weeks after radiation treatment.
No prophylactic agents are approved for preventing pelvic radiation enteritis, the researchers note, and evidence is weak for the nutritional interventions that have been tried. Recently, however, a growing body of research has pointed to a powerful role for probiotics in a variety of gastrointestinal conditions.
This study compared the efficacy of the probiotic Bifilact (Lactobacillus acidophilus LAC-361 and Bifidobacterium longum BB-536) with that of placebo. The main goal was to find out whether probiotics would prevent or delay the onset of moderate-to-severe radiation-induced diarrhea during radiotherapy. Secondary goals were to assess whether Bifilact reduced or delayed the need for antidiarrheal medication, reduced intestinal pain, reduced the need for hospitalization, minimized interruptions to radiotherapy, and improved patient well-being.
In the study, 229 patients received either placebo or 1 of 2 Bifilact regimens: a standard dose twice a day or a high dose 3 times a day. Patients recorded their digestive symptoms every day and met with a registered dietitian and radiation oncologist every week during treatment.
Although the differences were not statistically significant, probiotics eventually halved the proportion of patients with moderate-to-severe diarrhea. The mean number of bowel movements per 24 hours was 2.3 in the placebo group, 2.3 in the standard-dose group, and 2.1 in the high-dose group (P = .84). During treatment, those numbers changed to 2.9, 2.7, and 2.8 bowel movements per day (P = .80), respectively. For patients who underwent surgical procedures before radiation, probiotic intake tended to reduce all levels of diarrhea, especially the most severe grade 4 diarrhea. Median abdominal pain was initially 0 for all groups and < 1 during treatment for all groups (P = .23).
To their knowledge, the researchers say, only 6 human clinical studies have been published regarding using probiotics to prevent acute radiation-induced diarrhea, and only 1 for treating diarrhea after radiotherapy. The 6 studies showed positive results on diarrhea toxicity and/or frequency of bowel movement and/or stool consistency. Two of 3 systematic reviews also found a probable beneficial effect on prevention.
Their own study produced some interesting findings, the researchers say. For one, the lower rate of diarrhea at 60 days suggests the benefit began at the end of treatment or after it. Patients on the standard dose of probiotic experienced less moderate-to-severe diarrhea at the end of treatment and during the 2 weeks following it. And after 60 days, 35% of patients in the standard-dose group did not have moderate-to-severe diarrhea, compared with 17% in the placebo group (P = .23). The researchers say the probiotic effect may be delayed because of the time bacteria require to influence the inflammatory process.
Source
Demers M, Dagnault A, Desjardins J. Clin Nutr. 2014;33(5):761-767.
doi: 10.1016/j.clnu.2013.10.015.
Banning Smoking in Tribal Casinos
In American Indian (AI) communities that have casinos, health is improving, thanks to more employment and less poverty. But it is not all good news, because casino workers (of whom 1 in 4 is an AI) and patrons are still exposed to secondhand smoke in many casinos, say researchers from the Great Lakes Inter-Tribal Council (GLITC) in Lac du Flambeau (LDF), Wisconsin; Northwest Portland Area Indian Health Board, in Oregon; and the University of Oklahoma in Tulsa. Tribal casinos are exempt from statewide bans on smoking because of tribal sovereignty. However, though smoking has declined in other segments of the population, it has not among AIs, among whom the prevalence is 40%—more than twice that of the general U.S. population. Moreover, the tobacco industry has increasingly targeted tribal casinos, the researchers add.
Only 6 of 237 tribes operating casinos have voluntarily implemented casino-wide smoking bans. The tribal community is aiming to do something about that, though. The GLITC, a consortium of 12 member tribes in Wisconsin and Upper Michigan, and the LDF tribal nation, a member of GLITC, collaborated with the Lake of the Torches Resort Casino in northern Wisconsin to survey casino patrons to find out whether a smoking ban would win support.
The project team’s analysis was based on survey responses from 957 casino patrons who were questioned about their opinions on smoking, secondhand smoke, and smoking bans. Most respondents were white and nonsmokers. A majority (69%) were bothered to some extent by smoke in the casino, and 81% believed secondhand smoke is harmful. Those who preferred a smoke-free casino tended to be older, white, and to gamble less. Over half (54%) said they would likely visit more often if the casino were smoke free, 28% said they were indifferent, and 18% said they would visit less.
The researchers cite other studies that have found that only 20% of casino patrons smoke. They also say smoking bans are not cited as reasons people visit casinos less, and smoking bans do not result in revenue loss for casinos.
This is the first study to employ a community-based and tribally led approach. The access would not have been possible without the “significant trust” between GLITC and the LDF tribal nation, the researchers say. The casino, owned and operated by the tribal nation whose members indirectly benefit from casino revenue, was responsive to community concerns about secondhand smoke, they add. This suggests that tribal communities may be “uniquely suited…to play a leadership role in a smoke-free casino movement.”
Source
Brokenleg IS, Barber TK, Bennett NL, Peart Boyce S, Blue Bird Jernigan V. Am J Prev Med. 2014;47(3):290-299.
doi: 10.1016/j.amepre.2014.04.006.
Methotrexate: Finding the Right Starting Dose
Decades of experience have narrowed the most effective dose of methotrexate (MTX) for rheumatoid arthritis to somewhere between 15 mg and 25 mg per week. However, experience has also suggested that early and rapid control of the disease activity minimizes damage. The result has been a quicker escalation of MTX dosing, with a starting dose of 10 mg to 15 mg per week and escalating by 5 mg every month, rather than the more traditional 5 mg every 3 months.
But researchers from Post Graduate Institute of Medical Education and Research in Chandigarh, India, point out that the recommendation to start with the higher dose of 15 mg is based on “weak evidence.” What’s more, they say, only a limited number of studies have compared fixed MTX doses head-to-head, and many of those studies are 20 to 30 years old. No study had compared starting doses of 7.5 mg and 15 mg of MTX, the researchers say.
Starting higher may offer some efficacy benefit, but a higher dose can also lead to adverse effects (AEs), intolerance, and withdrawal from therapy, say the researchers. They sought a balance among efficacy, speed, and tolerability by comparing 2 dosage regimens of oral MTX, starting at either 7.5 mg or 15 mg per week and escalating by 2.5 mg every 2 weeks over 12 weeks, to a possible maximum of 25 mg per week. In group 1, 47 patients started on the lower dose, reaching a mean dose at 12 weeks of 17.3 mg per week. In group 2, 53 patients started on the higher dose and reached a mean dose of 23.6 mg per week. Among patients who completed the study, the mean doses were 19.2 mg per week and 24.5 mg per week, respectively (P < .001).
Nine patients withdrew from group 1, and 7 withdrew from group 2. The difference in withdrawals due to AEs was not statistically significant (P = .9). However, group 2 had a higher incidence of nausea and vomiting (42% vs 19% in group 1), although the severity and duration of nausea were similar in both groups. There were no significant differences in the frequency of cytopenia (P = .09) or transaminitis (P = .08), and no difference in disease activity at weeks 4, 8, or 12.
The researchers say one limitation of their study is the short duration. They chose 12 weeks, because guidelines had suggested 3 months as a decision point, when other drugs could be added if the patient did not respond to MTX. However, they note that the European League Against Rheumatism 2013 update now specifies that the 3-month period relates “solely to assessing improvements” and says it takes 6 months to see maximal efficacy. The researchers, agreeing with this, say they found “a relatively poor response” by week 12. Indeed, they say, in view of the relatively slow decline in disease activity, future studies might benefit from extending the follow-up period to 24 weeks.
Source
Dhir V, Singla M, Gupta N, et al. Clin Ther. 2014;36(7):1005-1015.
doi: 10.1016/j.clinthera.2014.05.063.