Medicare beneficiaries in hospice care get better care, have lower costs
Medicare fee-for-service beneficiaries suffering from poor-prognosis cancer who received hospice care were found to have lower rates of hospitalizations, admissions to intensive care units, and invasive procedures than those who did not receive hospice care, according to a study published in JAMA.
“Our findings highlight the potential importance of frank discussions between physicians and patients about the realities of care at the end of life, an issue of particular importance as the Medicare administration weighs decisions around reimbursing physicians for advance care planning,” said Dr. Ziad Obermeyer of the emergency medicine department at Brigham and Women’s Hospital in Boston, and his associates.
In a matched cohort study, Dr. Obermeyer and his colleagues examined the records of 86,851 patients with poor-prognosis cancer – such as brain, pancreatic, and metastatic malignancies – using a nationally representative, 20% sample of Medicare fee-for-service beneficiaries who died in 2011. Of that group, 51,924 individuals (60%) entered hospice care prior to death, with the median time from first diagnosis to death being 13 months (JAMA 2014;312:1888-96).
The researchers then matched patients in hospice vs. nonhospice care, using factors such as age, sex, region, time from first diagnosis to death, and baseline care utilization. Each sample group consisted of 18,165 individuals, with the non–hospice-care group acting as the control. The median hospice duration for the hospice group was 11 days.
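As a rough illustration of how 1:1 matching on such covariates can work, here is a minimal Python sketch; the covariates, weights, and greedy nearest-neighbor approach are simplified assumptions for illustration and not the authors' actual matching algorithm.

```python
import random

# Hypothetical patient records carrying a few of the matching covariates named above.
def make_patient(pid, hospice):
    return {"id": pid, "hospice": hospice,
            "age": random.randint(66, 95),
            "female": random.random() < 0.5,
            "months_dx_to_death": random.randint(1, 60)}

patients = [make_patient(i, hospice=(i % 2 == 0)) for i in range(200)]
cases = [p for p in patients if p["hospice"]]          # hospice beneficiaries
controls = [p for p in patients if not p["hospice"]]   # nonhospice beneficiaries

def distance(a, b):
    # Simple weighted distance over a few covariates (illustrative only).
    return (abs(a["age"] - b["age"])
            + 5 * (a["female"] != b["female"])
            + abs(a["months_dx_to_death"] - b["months_dx_to_death"]) / 6)

matched_pairs = []
for case in cases:
    if not controls:
        break
    best = min(controls, key=lambda c: distance(case, c))
    matched_pairs.append((case["id"], best["id"]))
    controls.remove(best)   # each control is used at most once (1:1 matching)

print(f"{len(matched_pairs)} matched pairs formed")
```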
Dr. Obermeyer and his associates discovered that hospice beneficiaries had significantly lower rates of hospitalization (42%), intensive care unit admission (15%), invasive procedures (27%), and deaths in hospitals or nursing facilities (14%), compared with their nonhospice counterparts, who had a 65% rate of hospitalization, a 36% rate of intensive care unit admission, a 51% rate of invasive procedures, and a 74% rate of deaths in hospitals or nursing facilities.
Furthermore, the authors found that nonhospice beneficiaries had a higher rate of health care utilization, largely for acute conditions that were not directly related to their cancer, and higher overall costs. On average, costs for hospice beneficiaries were $62,819, while costs for nonhospice beneficiaries were $71,517.
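For readers who want to check those figures, the arithmetic is straightforward; the short Python sketch below uses only the rates and mean costs reported above and is purely illustrative.

```python
# Rates and mean costs reported for the matched hospice and nonhospice groups.
hospice = {"hospitalization": 0.42, "icu_admission": 0.15,
           "invasive_procedures": 0.27, "death_in_facility": 0.14,
           "mean_cost": 62_819}
nonhospice = {"hospitalization": 0.65, "icu_admission": 0.36,
              "invasive_procedures": 0.51, "death_in_facility": 0.74,
              "mean_cost": 71_517}

for outcome in ("hospitalization", "icu_admission",
                "invasive_procedures", "death_in_facility"):
    diff = nonhospice[outcome] - hospice[outcome]
    print(f"{outcome}: {diff:.0%} absolute difference")

savings = nonhospice["mean_cost"] - hospice["mean_cost"]
print(f"mean cost difference per beneficiary: ${savings:,}")   # $8,698
```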
“Hospice enrollment of 5 to 8 weeks produced the greatest savings; shorter stays produced fewer savings, likely because of both hospice initiation costs, and need for intensive symptom palliation in the days before death,” Dr. Obermeyer and his coauthors wrote. “Cost trajectories began to diverge in the week after hospice enrollment, implying that baseline differences between hospice and nonhospice beneficiaries were not responsible for cost differences,” they added.
The study was supported by grants from the National Institutes of Health, the National Cancer Institute, and the Agency for Healthcare Research and Quality. The authors reported no relevant conflicts of interest.
Although the study by Obermeyer et al. adds to the evidence regarding hospice care for patients with poor-prognosis cancer, several caveats should be considered. An important threat to the validity of this study is that unobserved differences in preferences for aggressive care may explain the observed cost savings. Rightfully, the authors acknowledge this and other limitations, such as restriction of the study population to patients with cancer, exclusion of Medicare beneficiaries with managed care and non-Medicare patients, and reliance only on claims-based information for risk adjustments, Dr. Joan M. Teno and Pedro L. Gozalo, Ph.D., both of the Brown University School of Public Health, Providence, R.I., wrote in an editorial accompanying the study.
The findings from this study raise several important policy issues, they said. If hospice saves money, should health care policy promote increased hospice access? Perhaps an even larger policy issue involves the role of costs and not quality in driving U.S. health policy in care of the seriously ill and those at the close of life (JAMA 2014;312:1868-69).
The pressing policy issue in the United States involves not only patients dying of poor-prognosis cancers, but patients with noncancer chronic illness for whom the costs of prolonged hospice stays exceed the potential savings from hospitalizations. Even in that policy debate, focusing solely on expenditures is not warranted. That hospice or hospital-based palliative care teams save money is ethically defensible only if there is improvement in the quality of care and medical decisions are consistent with the informed patient’s wishes and goals of care.
Dr. Teno is a professor at the Brown University School of Public Health. Dr. Gozalo is an associate professor at the university.
FROM JAMA
Key clinical point: Medicare beneficiaries with poor-prognosis cancer who received hospice care had lower rates of hospitalization, ICU admission, and invasive procedures than those who did not.
Major finding: Of those receiving hospice care, 42% were admitted to the hospital vs. 65% of those not receiving hospice care.
Data source: Matched cohort study of Medicare fee-for-service beneficiaries.
Disclosures: The study was supported by grants from the National Institutes of Health, the National Cancer Institute, and the Agency for Healthcare Research and Quality. The authors reported no relevant conflicts of interest.
Fractional laser technology reduces facial acne scarring
Fractional laser technology, often used in the removal of unwanted tattoos, can improve the appearance and texture of facial acne scars, based on data published online Nov. 19 in JAMA Dermatology.
“The evolution from traditional nanosecond to picosecond lasers has been observed to produce a photomechanical effect that causes fragmentation of tattoo ink or pigment,” wrote Dr. Jeremy A. Brauer, a dermatologist in group practice in New York, and his coinvestigators. “An innovative optical attachment for the picosecond laser, a diffractive lens array, has been developed that gauges distribution of energy to the treatment area. This specialized optic affects more surface area, has a greater pattern density per pulse, and may improve the appearance of acne scars,” they reported.
In a single-center, prospective study, Dr. Brauer and his associates enrolled 20 patients – 15 women and 5 men – based on screenings to ensure no history of skin cancer, keloidal scarring, localized or active infection, immunodeficiency disorders, and light hypersensitivity or use of medications with known phototoxic effects. Of that initial group, 17 completed all six treatments and presented for follow-up visits at 1 and 3 months. Patients were aged 27-66 (mean age, 44 years), and included Fitzpatrick skin types I (one patient), II (seven patients), III (six patients), and IV (three patients).
Subjects mostly had rolling-type scars, boxcar scars, and icepick lesions related to acne. Each subject underwent six treatments with a 755-nanometer alexandrite picosecond laser fitted with a diffractive lens array; treatments were spaced 4-8 weeks apart.
Subjects also provided a subjective score for pain experienced during each treatment on a scale of 0 (no pain) to 10 (extreme pain). The patients also used a scale of 0-4 to indicate their satisfaction with improvement of overall skin appearance and texture prior to their final treatment, at the 1-month follow-up, and at the 3-month follow-up (with 0 being total dissatisfaction and 4 total satisfaction).
At the 1-month and 3-month follow-ups, three independent dermatologists gave masked evaluations of each patient’s improvement on a 4-point scale, with 0 = 0%-25%, 1 = 26%-50%, 2 = 51%-75%, and 3 = 76%-100%.
All patients were either “satisfied” or “extremely satisfied” with the appearance and texture of their facial skin after receiving the full treatment regimen, and recorded an average pain score of 2.83 out of 10. The masked assessment scores also were favorable, averaging 1.5 of 3 and 1.4 of 3 at the 1-month and 3-month follow-ups, with a score of 0 indicating 0-25% improvement and a score of 3, greater than 75% improvement, the researchers reported.
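To make the grading concrete, here is a small sketch of the 0-3 masked-assessment scale described above; the thresholds follow the scale as reported, and the function itself is only an illustration.

```python
def improvement_grade(percent_improvement: float) -> int:
    """Map percent global improvement onto the 0-3 masked-assessment grade."""
    if percent_improvement <= 25:
        return 0
    if percent_improvement <= 50:
        return 1
    if percent_improvement <= 75:
        return 2
    return 3

# Mean masked scores of 1.5 and 1.4 therefore fall in the 26%-50% improvement band.
print(improvement_grade(40))   # -> 1
```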
Dr. Brauer and his associates also evaluated three-dimensional volumetric data for each subject, which showed an average of 24.0% improvement in scar volume at the 1-month follow-up and 27.2% at the 3-month follow-up. Furthermore, histologic analysis revealed elongation and increased density of elastic fibers, as well as an increase in dermal collagen and mucin.
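The percent improvement in scar volume is simply the relative change from baseline; in the sketch below the scar volumes are hypothetical and chosen only to mirror the 1-month average reported above.

```python
def percent_volume_improvement(baseline_mm3: float, followup_mm3: float) -> float:
    """Percent reduction in scar volume relative to baseline."""
    return 100.0 * (baseline_mm3 - followup_mm3) / baseline_mm3

# A hypothetical scar of 100 mm^3 that shrinks to 76 mm^3 shows a 24.0% improvement.
print(percent_volume_improvement(100.0, 76.0))   # 24.0
```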
“This is the first study, to our knowledge, that demonstrates favorable clinical outcomes in acne scar management with the 755[-nm] picosecond laser and diffractive lens array,” the researchers noted. “Observed improvement in pigmentation and texture of the surrounding skin suggests that there may be benefits for indications beyond scarring,” they wrote.
The authors disclosed that funding for this study was provided in part by Cynosure, manufacturer of the Food and Drug Administration–approved 755-nm picosecond alexandrite laser used in the study, and noted that “Cynosure had a role in the design of the study but not the conduct, collection, management, analysis, or interpretation of data. They approved the manuscript but did not prepare or decide to submit.” Dr. Brauer disclosed receiving honoraria from Cynosure/Palomar Medical Technologies and consulting for Miramar. Several other coauthors disclosed financial relationships with multiple companies, including Cynosure/Palomar Medical Technologies.
FROM JAMA DERMATOLOGY
Key clinical point: Treatment of facial acne scars with a diffractive lens array and a picosecond 755-nm alexandrite laser improves the appearance and texture of skin within 3 months.
Major finding: Masked assessments by dermatologists found a 26%-50% global improvement at the 1-month posttreatment follow-up visit, which was maintained at the 3-month follow-up.
Data source: A single-center, prospective study of 20 patients.
Disclosures: This study was supported in part by Cynosure. The authors disclosed several potential conflicts of interest.
Telemedicine-based collaborative care benefits rural veterans with PTSD
U.S. military veterans living in rural areas who receive telemedicine-based collaborative care are more likely to engage in evidence-based psychotherapy and to have improved posttraumatic stress disorder outcomes, according to a new study.
“Although psychotherapy and pharmacotherapy treatments for PTSD have proven to be efficacious in randomized clinical trials and have been disseminated widely by the [Veterans Health Administration], stigma and geographic barriers often prevent rural veterans from engaging in these evidence-based treatments,” wrote the investigators, led by John C. Fortney, Ph.D., of the University of Washington’s department of psychiatry and behavioral sciences in Seattle. The study was published online Nov. 19 in JAMA Psychiatry (doi:10.1001/jamapsychiatry.2014.1575).
In a pragmatic, randomized effectiveness trial, Dr. Fortney and his associates recruited outpatients from 11 Department of Veterans Affairs (VA) community-based outpatient clinics (CBOCs) in predominantly rural areas of the United States over the course of 22 months. A total of 265 patients completed baseline interviews and randomization after meeting eligibility criteria, which consisted of meeting diagnostic standards for PTSD; having no medical history of schizophrenia, bipolar disorder, substance dependence, or hearing impairment; having a telephone; not having a life-threatening illness; and having the capacity to consent.
The 265 subjects were randomized into one of two groups: those receiving usual care (UC), or those receiving the Telemedicine Outreach for PTSD (TOP) treatment developed by the investigators. Patients were mostly unemployed, middle-aged men with severe PTSD symptoms and “other mental health coexisting illnesses,” according to a press release.
Subjects in the UC group received certain health care services, such as psychotropic medications for PTSD prescribed by psychiatrists, evidence-based psychotherapy for PTSD delivered by psychologists or social workers, supportive PTSD-focused therapy delivered by psychologists or social workers (individual and group), and supportive therapy delivered by social workers (individual and group), among others. Subjects in the TOP group, however, received the attention of telephone nurse care managers, including PTSD symptom monitoring and medication regimen adherence monitoring and promotion. In addition, those in the TOP group received access to a telephone pharmacist and telepsychologist.
Subjects were enrolled in a series of 12 cognitive processing therapy sessions over the course of the study, which ran from Nov. 23, 2009, through Sept. 28, 2011, and attendance was recorded at each session. After the sessions concluded, subjects were followed for 12 months.
During the follow-up period after treatments ended, patients who received TOP had significantly larger decreases in Posttraumatic Diagnostic Scale (PDS) scores than those in the UC group at both 6 months (from 35.0 to 29.1 vs. from 33.5 to 32.1; beta = −3.81; P = .002) and 12 months (to 30.1 in the TOP group; beta = −2.49; P = .04).
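The unadjusted within-group changes implied by those scores can be verified with simple subtraction; note that the published betas come from the authors' adjusted longitudinal models, so they will not match this back-of-the-envelope difference exactly.

```python
# 6-month PDS score changes reported above.
top_change = 35.0 - 29.1   # 5.9-point decrease in the TOP group
uc_change = 33.5 - 32.1    # 1.4-point decrease in the usual-care group

print(f"TOP decrease: {top_change:.1f} points")
print(f"UC decrease:  {uc_change:.1f} points")
print(f"unadjusted between-group difference: {top_change - uc_change:.1f} points")
```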
Subjects who attended at least eight cognitive processing therapy sessions were more likely to improve their PDS scores (beta = −3.86; 95% confidence interval, −7.19 to −0.54; P = .02). However, the authors noted that there were “no significant group differences in the number of PTSD medications prescribed and adherence to medication regimens.”
“This trial introduces a promising model for managing PTSD in a treatment-resistant population,” Dr. Fortney and his associates wrote. “Findings suggest that telemedicine-based collaborative care can successfully engage this population in evidence-based psychotherapy for PTSD, thereby improving clinical outcomes.”
Among the study limitations cited by the investigators is that the PDS was administered to assess PTSD, rather than the Clinician-Administered PTSD Scale, which is the reference standard.
The authors reported no relevant financial conflicts of interest.
FROM JAMA PSYCHIATRY
Key clinical point: Collaborative care models can “encourage veterans to initiate and adhere to evidence-based psychotherapies for PTSD.”
Major finding: Veterans receiving Telemedicine Outreach for PTSD had significantly larger decreases in Posttraumatic Diagnostic Scale scores (from 35.0 to 29.1), compared with those receiving usual care (from 33.5 to 32.1) at 6 (beta = −3.81; P = .002) and 12 (beta = −2.49; P = .04) months.
Data source: A multisite pragmatic, randomized effectiveness trial developed by the Veterans Health Administration and the National Institute of Mental Health.
Disclosures: The authors reported no financial conflicts of interest.
No difference between losartan and atenolol treatments for aortic-root dilation
Treatment with losartan instead of more conventional atenolol yielded no significant difference in aortic-root dilation in children and young adults with Marfan’s syndrome over 3 years, according to a new study published in the New England Journal of Medicine and presented simultaneously at the American Heart Association’s Scientific Sessions.
In a randomized trial, researchers identified 608 children and young adults with Marfan’s syndrome and randomized them to treatment with either losartan (267) or atenolol (268) for 3 years. The participants were between the ages of 6 months and 25 years, had been diagnosed with Marfan’s syndrome according to the original Ghent criteria, and had a maximum aortic-root diameter z score greater than 3.0, indexed to body surface area. Predefined baseline subgroups were aortic-root z score (less than 4.5 vs. 4.5 or greater), age group (child vs. young adult), and previous beta-blocker use (yes vs. no).
The baseline-adjusted annual rate of change in the aortic-root z score did not differ significantly between the losartan and atenolol groups: –0.107 ± 0.013 and –0.139 ± 0.013 standard deviation units, respectively (P = .08). Both rates of change were significantly less than zero, however, indicating that aortic-root z scores decreased with both treatments. Younger patients in both groups experienced greater decreases in aortic-root z scores (P = .002 for losartan and P < .001 for atenolol), with no significant difference between the treatments.
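For readers unfamiliar with the units, the reported figures are annual rates of change in the z score; the minimal sketch below uses hypothetical values only to show the scale of the numbers.

```python
def annual_z_change(z_baseline: float, z_followup: float, years: float) -> float:
    """Unadjusted annual rate of change in an aortic-root z score."""
    return (z_followup - z_baseline) / years

# A hypothetical patient whose aortic-root z score falls from 4.0 to 3.6 over the
# 3-year trial has a rate of about -0.13 SD units per year, the same order of
# magnitude as the group means reported above.
print(round(annual_z_change(4.0, 3.6, 3.0), 3))   # -0.133
```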
“We did not find the expected advantage of angiotensin-receptor blockade therapy over beta-blocker therapy,” wrote the authors, led by Dr. Ronald V. Lacro of the department of cardiology at Boston Children’s Hospital (N. Engl. J. Med. 2014 [doi:10.1056/NEJMoa1404731]).
There were “small but significant differences favoring atenolol in the average annual change in the absolute diameter and z score for the aortic annulus, but there were no significant differences in the diameter or z score for the ascending aorta,” according to the investigators.
“This finding was unexpected, without a clear physiological explanation,” wrote Dr. Lacro and his associates.
The study was supported by a grant from the National Institutes of Health’s National Heart, Lung, and Blood Institute. The authors reported no other financial conflicts of interest.
The study by Dr. Lacro and his colleagues is a large trial evaluating angiotensin-receptor blockade in patients with Marfan’s syndrome. Many expected the trial to confirm the superiority of losartan over atenolol in reducing rates of aortic growth. However, the trial showed no benefit in the rate of aortic dilatation when losartan was compared with atenolol over a 3-year period. The critical question is whether this finding argues for the rejection of losartan as a therapeutic option, or whether the study design masked its true benefit. “We believe the answer is, ‘Let’s wait and see,’ ” noted Dr. Juan M. Bowen and Dr. Heidi M. Connolly in an editorial accompanying the research report (N. Engl. J. Med. 2014 [doi:10.1056/NEJMe1412950]).
The results will stimulate healthy discussion about future directions in research and treatment. These findings indicate that clinicians should continue to consider beta-blockers to be the primary medical therapy for aortic protection in Marfan’s syndrome. Losartan appears to be a reasonable treatment option, especially in patients who cannot take beta-blockers. The risk of harm from losartan appears to be very low, but its efficacy needs to be firmly established before it becomes a first-line therapy, they noted.
Dr. Bowen and Dr. Connolly are both affiliated with the divisions of primary care internal medicine and cardiovascular diseases at the Mayo Clinic in Rochester, Minn. They did not report any relevant financial conflicts of interest.
FROM THE AHA SCIENTIFIC SESSIONS
Key clinical point: No significant difference in aortic-root dilation was found in children and young adults with Marfan’s syndrome treated with losartan or atenolol over a 3-year period.
Major finding: Annual rate of change in aortic-root z score did not differ significantly between losartan and atenolol groups: –0.107 ± 0.013 and –0.139 ± 0.013 standard deviation units, respectively (P = .08).
Data source: Randomized trial of 608 people with Marfan’s syndrome.
Disclosures: The study was supported by a grant from the National Institutes of Health’s National Heart, Lung, and Blood Institute. The authors reported no other financial conflicts of interest.
CABG beats PCI for revascularization of diabetics
Coronary artery bypass grafting is the preferred long-term revascularization technique for diabetics, based on data from a meta-analysis published in the Annals of Internal Medicine (2014;161:724-32 [doi:10.7326/M14-0808]).
“With more than 1 million revascularization procedures done annually in the United States alone, assessing the risks and benefits of these techniques in this subgroup is a public health priority,” said Dr. Benny Tu of Greenslopes Private Hospital in Queensland, Australia, and his colleagues. “In particular, deciding on an optimal revascularization strategy is a crucial element of clinical decision making,” the researchers wrote.
The researchers used a Bayesian network meta-analysis to combine 40 studies identified through searches of PubMed, the Cochrane Central Register of Controlled Trials, Ovid, and EMBASE covering Jan. 1, 1990, through June 1, 2014. Each of these studies was a randomized, controlled trial comparing the effects of percutaneous coronary intervention (PCI), whether with bare-metal stents (PCI-BMS) or drug-eluting stents (PCI-DES), with coronary artery bypass grafting (CABG) in adult diabetics with either multivessel or left main coronary artery disease.
The composite primary outcome of stroke, nonfatal MI, and all-cause mortality was 33% more likely in patients who underwent PCI than in those who underwent CABG (odds ratio, 1.33). PCI also was associated with a significant 44% increase in mortality (OR, 1.44) and a 44% decrease in stroke.
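An odds ratio of 1.33 corresponds to 33% higher odds of the composite outcome with PCI; the sketch below shows the calculation with hypothetical event counts chosen only to reproduce that figure.

```python
def odds_ratio(events_a: int, total_a: int, events_b: int, total_b: int) -> float:
    """Odds ratio for group A vs. group B from simple event counts."""
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

# Hypothetical 2x2 table: 200/1,000 events with PCI vs. 158/1,000 with CABG.
print(round(odds_ratio(200, 1000, 158, 1000), 2))   # 1.33
```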
“The largest advantage of CABG is in avoiding repeated revascularization,” the researchers noted; the need for repeated revascularization was 137% higher with PCI than with CABG.
Researchers extracted data relevant to study design, quality, patient characteristics, length of postprocedure follow-ups, and overall outcomes to determine which procedure proved most effective at mitigating mortality and the need for repeated revascularization. For duplicate publications, outcomes were obtained from the publication with the longest follow-up.
The 1997 Bypass Angioplasty Revascularization Investigation (BARI) “showed that CABG significantly decreased mortality in patients with diabetes at 5 years compared with percutaneous transluminal coronary angioplasty,” the researchers noted, adding that the current findings confirm BARI’s endorsement of CABG.
However, as techniques for both PCI and CABG have improved considerably between 1990 and 2014, the researchers advised physicians to review each option on a case-by-case basis, and to consider that PCI may in fact be the preferred option for some high-risk patients.
“Although CABG may generally be preferred, there are individual clinical situations in which PCI may be a reasonable alternative,” they wrote. “For example, it might be preferred for patients at high risk for perioperative stroke or whose long-term survival is compromised because of noncardiac factors.”
The study’s primary source of funding was the Fonds de recherche du Québec-Santé; the researchers reported no relevant financial disclosures.
FROM THE ANNALS OF INTERNAL MEDICINE
Key clinical point: CABG is the preferred long-term revascularization technique for diabetics.
Major finding: Diabetic patients with multivessel disease or left main coronary artery disease who underwent PCI were 33% more likely to experience stroke, nonfatal heart attack, or death than were those who underwent CABG.
Data source: Bayesian network meta-analysis of 40 studies.
Disclosures: The authors disclosed that the study’s primary source of funding was the Fonds de recherche du Québec-Santé but reported no other relevant financial disclosures.
Mali added to list of countries for enhanced Ebola screening
Travelers arriving in the United States from the West African nation of Mali will be subject to the same enhanced entry screening as are those coming from the Ebola-stricken countries of Liberia, Sierra Leone, and Guinea, because of a rise in the number of confirmed Ebola cases within Mali.
The measure, announced Nov. 16 by the Centers for Disease Control and Prevention and the Department of Homeland Security, takes effect Monday, Nov. 17. In a written statement, both agencies noted that while there are no direct flights from Mali to the United States, an average of 15-20 passengers per day – most of whom are U.S. citizens or lawful permanent residents – begin their flight itineraries in Mali with the United States as their eventual destination.
Enhanced entry screenings began in October in an effort to identify potential Ebola cases in the United States before the virus could spread. Once passengers land, health officials will collect contact information for all passengers and for their friends or relatives in the United States for monitoring purposes. Travelers will then be required to check in with local health agencies every day to report their temperature and any flulike symptoms, and will have to coordinate with the relevant public health officials if they plan to do any additional traveling within the country. Travelers who remain free of symptoms for 21 days are considered no longer at risk of having or spreading the Ebola virus.
“For ease of administration, we will work with the airlines to ensure rerouting for the few travelers from Mali not already scheduled to land at one of the five airports in the United States (New York JFK, Newark, Washington-Dulles, Chicago-O’Hare, and Atlanta Hartsfield-Jackson) already performing screening on passengers from the other affected West African nations,” the agencies said in the statement.
The first confirmed case of Ebola in the United States was Thomas Eric Duncan, who was diagnosed in Dallas on Sept. 30 and became the country’s first Ebola casualty on Oct. 8. On Monday, Nov. 17, U.S. permanent resident Dr. Martin Salia died at Nebraska Medical Center after working with Ebola patients in Sierra Leone.
Stress more dangerous for women with heart disease than for men
Mental stress triggered significantly more myocardial ischemia in younger women with stable coronary heart disease than in younger men with CHD, according to a study presented at the American Heart Association’s Scientific Sessions.
However, physical stress generated little or no difference in blood flow to the heart between women and men with stable coronary heart disease (CHD).
“Women who develop heart disease at a younger age make up a special high-risk group, because they are disproportionally vulnerable to emotional stress,” said Dr. Viola Vaccarino, chair of cardiovascular research and epidemiology at the Rollins School of Public Health at Emory University, Atlanta, and her colleagues in a statement.
In a population-based study, Dr. Vaccarino and her associates analyzed 534 patients (including 151 women) aged 38-79 years. All patients had CHD, which was verified by their medical records, and were divided into three age-based groups: 55 years or younger, 56-64 years, and 65 years or older.
Men and women in each age group underwent two stress tests, one mental and one physical, along with a third test while at rest. The mental test consisted of giving a public address about a real-life stressful scenario, while the physical test was a treadmill exercise conducted in accordance with the Bruce protocol. Patients who were unable to achieve the target heart rate in the physical test underwent pharmacological stress with regadenoson.
During each test, patients’ heart rates and blood pressures were monitored, and total severity scores were derived by quantifying myocardial perfusion deficit (size and severity) across 17 myocardial segments. Total ischemic perfusion deficit (IPD) was calculated for each patient as the difference between the severity score during mental or physical stress and the severity score at rest.
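As a purely illustrative aid (the numbers below are made up, not study data), the IPD calculation described above amounts to summing per-segment deficits into a severity score and subtracting the rest score from the stress score:

```python
# Minimal sketch of the IPD arithmetic described above, using made-up numbers.
# A severity score is the per-segment perfusion deficit summed over the 17
# standard myocardial segments; IPD is the stress score minus the rest score.
from typing import Sequence

def severity_score(segment_deficits: Sequence[float]) -> float:
    """Total severity score: sum of per-segment deficits (size x severity)."""
    assert len(segment_deficits) == 17, "expects the 17-segment model"
    return sum(segment_deficits)

def ischemic_perfusion_deficit(rest: Sequence[float], stress: Sequence[float]) -> float:
    """IPD = severity score during (mental or physical) stress minus score at rest."""
    return severity_score(stress) - severity_score(rest)

# Hypothetical patient: mild resting deficits that worsen under mental stress.
rest_segments = [0.0] * 14 + [1.0, 2.0, 1.0]
stress_segments = [0.0] * 12 + [1.0, 2.0, 3.0, 3.0, 2.0]
print(ischemic_perfusion_deficit(rest_segments, stress_segments))  # 7.0
```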
Overall, women had greater IPD with mental stress than men did, and younger age exacerbated the difference.
Women aged 55 years or younger experienced 3.5 times the mean IPD of men in the same age group (139 vs. 40; P < .0001). Mean IPD in women aged 56-64 years was 1.8 times that in men (101 vs. 56; P = .03), while mean IPD in women 65 years or older was almost identical to that of their male counterparts (69 vs. 61; P = .5).
Conversely, physical stress yielded no significant difference between men and women in any of the three age groups. Those aged 55 years or younger showed the largest, albeit nonsignificant, difference, with women having 1.5 times the IPD of similarly aged men under physical stress (123 vs. 84; P = .22). In the other two age groups, differences between women and men were minimal and nonsignificant.
“Health care providers should be aware of young and middle-age women’s special vulnerability to stress, and ask the questions about psychological stress that often don’t get asked,” the authors advised. “If they note that their patient is under psychological stress or is depressed, they should advise the woman to get relevant help or support from mental health providers, stress-reduction programs, or other means.”
The National Heart, Lung, and Blood Institute funded the study. Dr. Vaccarino had no relevant disclosures.
Key clinical point: Younger women with stable coronary heart disease suffer greater stress-induced myocardial ischemia than men do.
Major finding: After mental stress, ischemic perfusion deficits were 3.5 times greater in women aged 55 years or younger with CHD than in men of similar age with CHD.
Data source: A population-based study of 534 patients aged 38-79 years with stable coronary heart disease.
Disclosures: The National Heart, Lung, and Blood Institute funded the study. Dr. Vaccarino had no relevant disclosures.
CDC: Contact lens wearers must be careful to avoid risk, burden of keratitis
Nearly 1 million visits to physicians’ offices, outpatient clinics, and emergency departments every year are directly related to keratitis or other contact lens–related disorders, at a cost of millions of dollars in direct health care expenditures, according to a new study released Nov. 14 by the Centers for Disease Control and Prevention.
“Here at CDC, we’ve long suspected that keratitis poses a significant burden on Americans’ health and on our health care system, both in individual cases and in periodic, multistate outbreaks associated with contact lens wearers,” Dr. Jennifer Cope, a CDC medical epidemiologist, said in a telebriefing. “But until now, we didn’t have any estimates as to how large that burden might be.”
According to an analysis of 2010 data from three national ambulatory care and emergency department databases, the CDC estimated that keratitis and other contact lens–related disorders cost Americans $175 million in direct health care costs each year, including $58 million for Medicare patients and $12 million for Medicaid patients. Of the nearly 1 million health care visits related to keratitis and other contact lens–related disorders, roughly 930,000 were to physicians’ offices and outpatient clinics, while around 58,000 were to emergency departments, ultimately accounting for 250,000 clinician hours yearly (MMWR 2014;63:1027-30).
The CDC study indicated that 76.5% of keratitis-related health care visits every year result in prescriptions for antimicrobial agents. Separately, roughly 230,000 visits to clinicians’ offices or outpatient clinics for corneal disorders every year are related to contact lens use, and 70% of those require antimicrobial prescriptions. In an effort to raise public awareness of contact lens–related disorders ahead of its “Contact Lens Health Week,” which runs from Nov. 17 to 21, the CDC is urging the nearly 38 million American contact lens wearers to practice good lens hygiene to reduce their risk of developing keratitis, which can lead to partial or complete loss of vision.
“Wearing contacts and not taking care of them properly is the single largest risk factor for [developing] keratitis,” said Dr. Cope. “Some bad habits, like sleeping in your contact lenses, failing to clean and replace your storage case frequently, and letting contact lenses get in water – whether you’re swimming, showering, or rinsing the lenses in water instead of in contact lens solution – greatly increase a person’s risk for developing keratitis.”
The CDC recommends washing your hands with soap and water before putting in contact lenses, to reduce the chance of introducing bacteria into your eyes; putting fresh contact lens solution into your storage case daily; replacing contact lens cases every 3 months; and never sleeping in your contact lenses unless explicitly advised to do so by your doctor.
“People who wear their contact lenses overnight are more than 20 times more likely to get keratitis,” Dr. Cope noted, adding that “Contact lenses can provide many benefits, but they are not risk-free.”
More information on the CDC’s contact lens “Wear and Care” guidelines can be found at the CDC website.
FROM MMWR