Prehospitalization Steroids Don't Reduce Risk of Acute Lung Injury
HOUSTON – Prehospital use of systemic corticosteroids does not prevent the development of acute lung injury in at-risk patients, according to data reported at the annual congress of the Society of Critical Care Medicine.
In the first study to specifically evaluate the prophylactic value of prehospital systemic corticosteroids in patients with at least one risk factor for acute lung injury (ALI), Dr. Lioudmila Karnatovskaia of the Mayo Clinic in Jacksonville, Fla., and colleagues found a statistically similar incidence of ALI among at-risk patients who were and were not taking systemic corticosteroids at the time of hospitalization.
The investigators also determined that prehospital use of systemic corticosteroids did not affect the need for mechanical ventilation or overall mortality – a finding that appears to contradict previous studies that have linked preventive steroids in at-risk patients with increased rates of ALI and acute respiratory distress syndrome, Dr. Karnatovskaia said.
The study was a planned exploratory subgroup analysis of the Lung Injury Prediction Score cohort of the U.S. Critical Illness and Injury Trials Group, which prospectively enrolled 5,584 patients who were admitted to 22 acute care hospitals and who had predisposing conditions for ALI, including sepsis, shock, pancreatitis, pneumonia, aspiration, high-risk trauma, and high-risk surgery. The primary outcome was the development of ALI, and secondary outcomes were need for invasive ventilation and ICU and hospital mortality, Dr. Karnatovskaia said, noting that the data were analyzed using univariate, logistic regression, and propensity score–based analyses.
For the propensity analysis, "the propensity score balanced all of the covariates. Of the 458 patients on systemic corticosteroids, 443 were matched up to 1:4 to those not on systemic corticosteroids, for a total of 1,332 matched patients," she said. "We calculated adjusted risk for acute lung injury, invasive ventilation, and in-hospital mortality from the propensity score–matched sample using a conditional logistic regression model."
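The pipeline described above can be sketched roughly as follows: estimate a propensity score by logistic regression, then greedily match each treated patient to up to four untreated patients with similar scores. The covariates and simulated data here are hypothetical illustrations, not the study's actual code or dataset.

```python
# Illustrative sketch of propensity score estimation and 1:4 matching.
# All variable names and data are assumptions for demonstration only.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
age = rng.normal(60, 15, n)   # hypothetical covariate
bmi = rng.normal(27, 5, n)    # hypothetical covariate
# Treatment (prehospital steroids) is more common among older patients,
# mimicking the confounding the propensity score is meant to balance.
treated = rng.random(n) < np.where(age > 60, 0.15, 0.05)

# Fit logistic regression P(treated | covariates) by gradient descent
X = np.column_stack([np.ones(n), (age - age.mean()) / age.std(),
                     (bmi - bmi.mean()) / bmi.std()])
w = np.zeros(3)
for _ in range(500):
    p = 1 / (1 + np.exp(-X @ w))          # predicted propensity
    w -= 0.1 * X.T @ (p - treated) / n    # gradient step
propensity = 1 / (1 + np.exp(-X @ w))

# Greedy up-to-1:4 nearest-neighbor matching on the propensity score
available = set(np.flatnonzero(~treated))
matches = {}
for t in np.flatnonzero(treated):
    nearest = sorted(available,
                     key=lambda c: abs(propensity[c] - propensity[t]))[:4]
    matches[t] = nearest
    available -= set(nearest)

print(len(matches), sum(len(v) for v in matches.values()))
```

Outcomes in the matched sample would then be compared with a conditional logistic regression, as the investigators describe.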
Of the 5,584 patients, 458 were on systemic corticosteroids at the time of hospitalization and 5,126 were not. Among the systemic corticosteroid group, 34 (7.4%) developed ALI, compared with 343 (6.7%) of those not taking them, Dr. Karnatovskaia reported. In the systemic corticosteroid group, 104 patients (23%) required mechanical ventilation and 35 patients (8%) died, compared with 1,752 (34%) and 172 (3%) of those not taking systemic corticosteroids, she said.
On univariate analysis, systemic corticosteroid patients were more likely to be older, to be white, and to have diabetes, chronic obstructive pulmonary disease, malignancy, or previous chest radiation, Dr. Karnatovskaia said, noting that they were also more likely to have a lower body mass index and to be on a statin drug, inhaled steroid, inhaled beta-agonist, proton pump inhibitor, ACE inhibitor, angiotensin receptor blocker, or insulin and were less likely to abuse alcohol or smoke tobacco.
After adjustment for significant covariates, systemic corticosteroid use was not independently associated with the development of ALI or the need for invasive ventilation, but did appear to be an independent predictor of ICU and hospital mortality, Dr. Karnatovskaia said. The latter association fell away, however, in the propensity score–based analysis. "Following propensity score–based analysis with matching, the association of prehospital systemic corticosteroids with mortality no longer remained significant," she said.
The findings are limited by the lack of data on the indication for systemic corticosteroid therapy, its duration, "and even whether it was continued throughout the hospital stay," as well as the fact that patients on prehospital systemic corticosteroids appeared to have worse functional status, which might have influenced their outcomes, according to Dr. Karnatovskaia. Although using the propensity score with matching addressed this as well as other hidden biases, "the potential for unmeasured effects remains," she said.
The study’s strengths include the large number of patients at risk for ALI enrolled from different centers and regions in the United States, as well as two hospitals in Turkey, and the use of comprehensive propensity score–based analysis with matching in addition to traditional logistic regression, Dr. Karnatovskaia said.
Ideally, the finding that prehospital use of systemic corticosteroids does not mitigate the development of ALI would be validated in a randomized controlled trial to best address any causal relationship, "but such a study would not be practical," Dr. Karnatovskaia said.
Dr. Karnatovskaia reported having no relevant financial disclosures.
FROM THE ANNUAL CONGRESS OF THE SOCIETY OF CRITICAL CARE MEDICINE
EDEN Trial Questions Restricting Nutrition to ALI Patients
HOUSTON – Restricting the amount of initial enteral intake in mechanically ventilated patients who have acute lung injury neither reduces the duration of mechanical ventilation nor improves mortality relative to full enteral feeding, but the nutritional strategy may be slightly easier on the stomach, according to a study reported Feb. 5 at the annual Critical Care Congress of the Society of Critical Care Medicine.
The importance of nutrition support in critically ill patients with acute lung injury (ALI) is well accepted as a means of maintaining gut integrity, modulating both stress and the systemic immune response, and attenuating disease severity, but conflicting data regarding the timing, formulation, and amount of enteral nutrition have contributed to uncertainty about the optimal feeding protocol, according to Dr. Todd W. Rice of Vanderbilt University Medical Center in Nashville, Tenn.
"How much nutrition we need to promote the protective benefits, we don’t know. Providing a little bit of nutrition (called trophic feeding) has been shown to decrease intestinal intolerances, compared with full-calorie feeds, but it may do so at the risk of malnutrition, worse immune function, and loss of muscle strength," he said. Full-calorie feeding, on the other hand, may lead to more intolerances, may cause hyperglycemia and other imbalances, may increase septic complications, and may fuel the inflammatory fire, he added.
In the current study, which was published simultaneously in JAMA, Dr. Rice and colleagues in the EDEN (Early vs. Delayed Enteral Nutrition in ALI) trial sought to examine the relative advantages of restricting the amount of initial enteral intake in mechanically ventilated ALI patients. Specifically, the prospective, randomized, open-label trial compared the effect on clinical outcomes and survival of initial trophic enteral feeding – approximately 25% of the full target feeding – with initial full-calorie feeding for the first 6 days of mechanical ventilation in ALI patients. "We hypothesized that reduced trophic feeding during the first [6 days] would increase ventilator-free days and reduce instances of gastrointestinal intolerances compared with the conventional full enteral nutrition strategy," he said.
The study’s primary end point was ventilator-free days through day 28; secondary end points were daily percentage of goal enteral feeding, frequency of gastrointestinal intolerances, 60-day mortality before hospital discharge with unassisted breathing, ICU- and organ failure–free days, and new infections (JAMA 2012 Feb. 5 [doi:10.1001/jama.2012.137]).
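Ventilator-free days, the primary end point, are conventionally scored as the number of days alive and free of mechanical ventilation through day 28, with death before day 28 scored as zero. The article does not spell out the scoring rule, so the convention below is an assumption based on common critical care trial practice:

```python
# Conventional ventilator-free-days (VFD) scoring, assumed here for
# illustration; the EDEN protocol's exact rule is not given in the article.
def ventilator_free_days(vent_days, survived_to_28, horizon=28):
    """Days alive and off the ventilator through the horizon (default 28).
    Patients who die before the horizon score 0."""
    if not survived_to_28:
        return 0
    return max(horizon - vent_days, 0)

print(ventilator_free_days(13, True))   # 15 -- roughly the averages reported
print(ventilator_free_days(10, False))  # 0 -- death before day 28 scores 0
```

Under this scoring, a patient ventilated about 13 days who survives scores close to the ~15 ventilator-free days reported in both arms.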
The multicenter study population comprised 1,000 patients, from January 2008 through mid-April 2011, who were initiated on mechanical ventilation within 48 hours of developing ALI. Within 6 hours of randomization, enteral nutrition was initiated in 508 patients assigned to trophic nutrition and 492 assigned to full feeding, and was continued until death, extubation, or day 6, Dr. Rice explained. Per standard protocol, enteral nutrition in the full-feeding group began at 25 mL/hr and advanced to goal rates (25-30 kcal/kg per day of nonprotein calories and 1.2-1.6 g/kg per day of protein) as quickly as possible; gastric residual volumes were checked every 6 hours while enteral feeding was increased. In the trophic group, enteral feeding was initiated at 10-20 kcal/hr and gastric residual volumes were checked every 12 hours. After 6 days, patients in the trophic group who still required mechanical ventilation were advanced to the full-energy feeding rates, he said.
Baseline characteristics of the two groups were similar, Dr. Rice noted. "The primary etiologies of lung injury in both groups of patients were pneumonia and sepsis, and the average APACHE III [Acute Physiology and Chronic Health Evaluation III] score was approximately 92. These were sick patients," he said. For the first 6 days, the full- and trophic feeding groups received 1,300 kcal/day and 400 kcal/day, respectively.
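A quick arithmetic check (simple back-of-the-envelope figures, not from the paper) shows how the reported hourly trophic rate lines up with the daily calories actually delivered:

```python
# Back-of-the-envelope consistency check on the reported feeding numbers.
trophic_low = 10 * 24    # kcal/day at the 10 kcal/hr starting rate
trophic_high = 20 * 24   # kcal/day at the 20 kcal/hr starting rate
print(trophic_low, trophic_high)  # 240 480 -- brackets the ~400 kcal/day received

# The trophic group received roughly 31% of the full group's delivered calories
ratio = 400 / 1300
print(round(ratio, 2))   # 0.31
```

So the 10-20 kcal/hr starting rate is consistent with the roughly 400 kcal/day the trophic group received, about a third of the full-feeding group's 1,300 kcal/day.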
With respect to the primary end point (28 days), the average number of ventilator-free days in both groups was similar, at 14.9 in the trophic group and 15.0 in the full-feeding group. "There were also no differences in 60-day mortality, organ failure–free days, ICU-free days, or the incidence of infection between groups," he said. Similarly, with respect to body mass index category or lung injury severity, "there were no between-group differences in ventilator-free days or survival."
The full-feeding group did have a higher number of gastrointestinal intolerances on any one day, and a statistically significant increase on days 2 and 3, but the overall percentages of intolerances were low, Dr. Rice said. There were no differences in albumin and protein levels between the groups over the first 7 days, he said.
Regarding the immediate clinical relevance of the findings, Dr. Rice stressed that the study wasn’t designed as an equivalence trial, "so I can’t tell you both feeding strategies are similar, but you can look at the results." In fact, he said, although the study did not show a benefit other than improved gastrointestinal tolerance, his group has moved toward trophic feeds because of the ease of administration. "Our nurses love the trophic feeds. Starting at 10-20 cc/hr and running it for 6 days is a lot less hassle than worrying about trying to ramp it up and get to goals," he said.
"Looking ahead, there are a number of places to go" with this research, Dr. Rice said. "Some of the questions we’ve thought about are what role does this play in the [total parenteral nutrition] question, and whether we need to be feeding patients at all. Initially, we thought the idea of not feeding patients would be a hard study to sell, but with these data, it may not be an unreasonable thing to look at."
Dr. Rice disclosed no financial conflicts of interest.
FROM THE ANNUAL CONGRESS OF THE SOCIETY OF CRITICAL CARE MEDICINE
Vets With PTSD: Individualized Vocational Support Ups Employment Odds
Unemployed veterans with post-traumatic stress disorder experience better employment outcomes when they receive individual job placement and support services, compared with standard vocational rehabilitation services, new research shows.
Recipients of evidence-based individual placement and support (IPS) were significantly more likely to gain competitive employment than were recipients of the standard Vocational Rehabilitation Program (VRP) services, Dr. Lori L. Davis of the Tuscaloosa (Alabama) Veterans Affairs Medical Center and her colleagues reported Feb. 2 online ahead of print in Psychiatric Services. Additional employment outcomes, including time worked and total earnings, also favored IPS, they wrote (Psychiatric Serv. 2012 [doi:10.1176/appi.ps.201100340]).
The study is the first to examine employment outcomes for veterans with PTSD who received IPS, compared with those with PTSD enrolled in the VRP, which is offered by the U.S. Department of Veterans Affairs.
For the prospective study, 85 unemployed veterans with PTSD aged 19-60 years at the Tuscaloosa VA Medical Center from 2006 to 2011 were randomized to either IPS-supported employment (42) or VRP treatment as usual (43). The employment rates and occupational outcomes of the veterans were followed for 12 months. All of the subjects were medically cleared to participate in a work activity, were interested in competitive employment, and were planning to remain within a 100-mile radius of the medical center for the duration of the study, the authors wrote.
Excluded from the study were veterans with a severe disorder resulting from severe traumatic brain injury; those diagnosed with schizophrenia, schizoaffective disorder, bipolar I disorder, or dementia; those with an immediate need for alcohol or drug detoxification; or those with pending active legal charges with expected incarceration, the researchers wrote.
The main tenets of the evidence-based IPS supported employment model are client choice, rapid job finding where appropriate, competitive education programs, integrated education and work settings, and follow-along supports. The individualized client-centered services are provided by a multidisciplinary team that integrates and coordinates treatment and rehabilitation, according to the authors. Standard VRP care includes routine prevocational testing and evaluation; vocational rehabilitation therapy comprising a work regimen and the use of special employer incentives; on-the-job training; apprenticeships; and nonpaid work experience. Limited supportive rehabilitation and independent living services are also included, they stated.
At baseline, all of the study participants underwent a psychiatric and general medical evaluation, including a medical history, psychiatric history, and family psychiatric history. A clinical research coordinator also evaluated each participant for PTSD and other Axis I disorders using the Mini-International Neuropsychiatric Interview at baseline and at 1-, 2-, 3-, 4-, 6-, 8-, 10-, and 12-month follow-up visits. In addition, the coordinator reviewed subjects’ job logs, which included their employment status, number of hours worked, wages earned, and reasons for missed work.
An analysis of the study results showed that the 85 subjects had been unemployed for a mean of 18.9 months and that, in addition to PTSD, 89% of the subjects had major depressive disorder, 20% had dysthymia, 54% had agoraphobia, 59% had panic disorder, 28% had social phobia, 42% had alcohol dependence, 21% had alcohol abuse, 37% had drug dependence, and 18% had drug abuse, the authors reported.
With respect to employment outcomes, 32 (76%) of the 42 IPS participants gained competitive employment, compared with 12 (28%) of the 43 VRP participants. Thus, the authors wrote, "... veterans with PTSD who participated in IPS were 2.7 times more likely to gain competitive employment than those who received VRP." The number needed to treat was 2.07, they noted. "In other words, if three individuals received IPS and three received VRP, one more individual in the IPS intervention would get a competitive job."
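The effect sizes quoted above follow from simple arithmetic on the reported counts. A minimal sketch (the counts are taken from the article; variable names are illustrative):

```python
# Figures reported in the study: 32 of 42 IPS participants and
# 12 of 43 VRP participants gained competitive employment.
ips_employed, ips_n = 32, 42
vrp_employed, vrp_n = 12, 43

ips_rate = ips_employed / ips_n  # about 0.76
vrp_rate = vrp_employed / vrp_n  # about 0.28

# Relative risk: the reported "2.7 times more likely".
relative_risk = ips_rate / vrp_rate

# Number needed to treat: reciprocal of the absolute risk difference,
# the reported 2.07 (i.e., roughly one extra job per two-to-three treated).
nnt = 1 / (ips_rate - vrp_rate)

print(round(relative_risk, 1), round(nnt, 2))  # prints: 2.7 2.07
```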
Of the eligible weeks during the study, IPS participants worked in a competitive job an average of 42%, compared with 16% for the VRP participants, the authors reported. Further, the mean total gross 12-month income for the IPS group was $9,264 – significantly higher than the mean $2,601 earned by the VRP group.
The study findings are limited by the single-site design and the exclusion of nonveterans, the authors acknowledged. "A multisite trial with a larger and more diverse study sample would confirm the results and allow examination of secondary outcomes, such as PTSD symptoms, quality of life, and other such outcomes," they wrote. "In addition, a larger study could evaluate the cost effectiveness of the IPS intervention."
The investigators disclosed financial relationships with AstraZeneca, Sunovion, Pfizer, MedAvante, and Roche.
FROM PSYCHIATRIC SERVICES
Major Finding: Unemployed veterans with PTSD who received individual placement and support employment services were 2.7 times more likely to gain competitive employment than were those who received standard vocational rehabilitation services.
Data Source: A single-site prospective randomized trial comprising 85 unemployed veterans with PTSD assigned to either individual placement and support or standard vocational rehabilitation services for a 12-month period.
Disclosures: The investigators disclosed financial relationships with AstraZeneca, Sunovion, Pfizer, MedAvante, and Roche.
Updated Criteria Revisit PCI Appropriateness Scenarios
Percutaneous coronary intervention is an appropriate procedure for patients with two-vessel coronary artery disease with proximal left anterior descending artery involvement and for those who have three-vessel disease with a low coronary artery disease burden, but it may not be reasonable for patients with three-vessel disease and an intermediate-to-high CAD burden.
That’s according to new appropriate use criteria for coronary revascularization released Jan. 30 by the American College of Cardiology Foundation and key specialty and subspecialty societies.
In the new document – the first focused update to the original Appropriate Use Criteria (AUC) for Coronary Revascularization published in 2009 – the latter scenario is graded as "uncertain," as is percutaneous coronary intervention (PCI) for patients with isolated left main stenosis and those with left main stenosis and additional CAD with low CAD burden. Further, the procedure is inappropriate for patients with left main stenosis and additional CAD with intermediate-to-high CAD burden, Dr. Manesh Patel of Duke University, Durham, N.C., and colleagues on the Appropriate Use Criteria Task Force reported (J. Am. Coll. Cardiol. 2012 Jan. 30 [doi: 10.1016/j.jacc.2011.12.001]). In the previous appropriate use document, PCI was deemed inappropriate for low-burden left main disease and uncertain for low-burden three-vessel disease, they stated. Coronary artery bypass grafting (CABG), on the other hand, maintained its 2009 rating of appropriate for all six clinical scenarios, the authors wrote.
In addition to the changes to the PCI appropriateness ratings above, the updated ratings indicate that coronary revascularization of the presumed culprit artery is appropriate for acute coronary syndrome patients with unstable angina/non–ST-segment elevation MI (UA/NSTEMI) with intermediate-risk features – defined as a Thrombolysis in Myocardial Infarction (TIMI) score of 3-4 – for short-term risk of death or nonfatal MI, while it is uncertain for UA/NSTEMI patients with low-risk features (TIMI score of 2 or less) for short-term risk of death or nonfatal MI. Among asymptomatic patients without prior bypass surgery, revascularization is inappropriate for those with one- or two-vessel CAD with no proximal left anterior descending artery involvement and no history of noninvasive testing.
The updated criteria are intended to fill in the gaps identified in the prior criteria and to take into account results of clinical trials that have been reported since the initial publication, including the Synergy Between PCI With TAXUS and Cardiac Surgery (SYNTAX) trial comparing the two revascularization procedures in patients with left main or triple-vessel CAD, the authors wrote.
For the 2009 criteria, the writing group identified 180 clinical scenarios reflecting common patient presentations in everyday cardiology practice. These scenarios were then rated by an expert panel – comprising interventional and noninterventional cardiologists, surgeons, internal medicine physicians, and health outcomes researchers – using a modified Delphi exercise to assess whether an invasive revascularization procedure would be appropriate, inappropriate, or uncertain based on symptom status, extent of medical therapy, risk level, and coronary anatomy (J. Am. Coll. Cardiol. 2009;53:530-53).
For the updated document, the writing group reassessed the clinical scenarios and identified those warranting reevaluation, expansion, or consolidation, the authors explained. In this regard, they identified and reexamined four indications possibly affected by the results of the SYNTAX trial, splitting two of them to represent levels of disease burden, as noted above. They also identified a gap in the clinical scenarios related to lower-risk UA/NSTEMI patients and asymptomatic patients with one- or two-vessel CAD not involving the proximal left anterior descending artery in whom no noninvasive testing had been performed, and developed indications to address the omissions.
Basing the appropriate use criteria on current understanding of the technical capabilities and potential patient benefits, the technical panel scored each indication on a scale of 1 to 9. A given procedure was considered appropriate for an indication if the median score was 7-9; uncertain if the median score was 4-6; and inappropriate if the median score was 1-3.
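The median-score rule described above is mechanical enough to sketch in a few lines. A minimal illustration (the function name and sample scores are hypothetical; only the 7-9 / 4-6 / 1-3 thresholds come from the document):

```python
from statistics import median

def auc_rating(panel_scores):
    """Map a panel's 1-9 scores to an AUC rating via the median,
    per the thresholds described in the document."""
    m = median(panel_scores)
    if m >= 7:
        return "appropriate"    # median 7-9
    if m >= 4:
        return "uncertain"      # median 4-6
    return "inappropriate"      # median 1-3

# Hypothetical panel scores for one indication:
print(auc_rating([8, 7, 9, 6, 8]))  # prints: appropriate (median is 8)
```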
Clinicians can use the updated criteria as decision support or educational tools when considering the need for revascularization, the authors wrote. "Moreover, these criteria can be used to facilitate discussion with patients and/or referring physicians about revascularization," they stated, noting also that "facilities and payers may choose to use these criteria either prospectively in the design of protocols or pre-authorization procedures, or retrospectively for quality reports."
Importantly, the appropriate use criteria "are intended to evaluate overall patterns of care regarding revascularization rather than adjudicating specific cases," the authors wrote. While the criteria reflect a general assessment of when revascularization may or may not be useful for specific patient populations, "physicians and other stakeholders should continue to acknowledge the pivotal role of clinical judgment in determining whether revascularization is indicated for an individual patient."
Dr. Patel reported having no financial disclosures. Relationships with industry of all the writing committee members are included in the AUC document.
FROM THE JOURNAL OF THE AMERICAN COLLEGE OF CARDIOLOGY
Lower Extremity Amputations Decline Among Diabetic Patients
Discharge rates for nontraumatic lower extremity amputation among diabetic individuals aged 40 years and older declined 65% from 1996 to 2008, according to a study by the Centers for Disease Control and Prevention, announced by the agency on Jan. 24 and also published in the February issue of Diabetes Care.
The CDC’s study is the first comprehensive examination of trends in nontraumatic lower extremity amputation (NLEA) rates and characteristics associated with diabetes-related NLEAs in the U.S. population. Yanfeng Li, MPH, and colleagues at the CDC in Atlanta analyzed data from two nationally representative surveys – the National Hospital Discharge Survey (NHDS) and the National Health Interview Survey (NHIS) – and determined that the age-adjusted NLEA discharge rate per 1,000 persons among diabetic individuals aged 40 years or older decreased from 11.2 in 1996 to 3.9 in 2008, while rates among nondiabetic individuals remained similar. However, even with the dramatic decline, the age-adjusted NLEA rate in the diabetic population remained approximately eight times higher than that in the nondiabetic population, at 3.9 vs. 0.5 per 1,000 persons, the authors wrote (Diabetes Care Feb. 2012;35:273-7).
When analyzed by demographic characteristics within the diabetic population, the NLEA rates decreased significantly in all of the demographic groups considered. Throughout the 12-year study period, however, the rates were significantly higher among diabetic patients 75 years or older, compared with those aged 40-64 years and those aged 65-74 years, the authors reported. They were also significantly higher among men than women and among blacks than whites, they stated.
The findings are limited by several factors: underestimation of the size of the total diabetic population (estimates did not include individuals with undiagnosed disease or those in nursing homes); the exclusion of data on NLEAs performed in long-term hospitals, Veterans Affairs hospitals, or outpatient settings; possible duplication of patients hospitalized more than once in a given year; and the absence of a racial designation for a large proportion of patients. Nevertheless, they indicate that increased attention to risk-factor management, patient education, and appropriate foot care in recent years has led to a reduction in NLEA hospitalizations, the authors wrote. The persistent racial disparities and continued increased risk for NLEA among diabetic patients suggest more can be done. "Further decreases in rates of NLEA will require continued awareness of diabetes and its complications among patients and providers as well as comprehensive interventions to reduce the prevalence of risk factors for NLEA and to improve foot care and overall care for people with diabetes, particularly for those in subpopulations at higher risk for NLEA," they concluded.
The authors reported having no relevant conflicts of interest.
FROM DIABETES CARE
Major Finding: The rates of hospitalization for nontraumatic lower-extremity amputation declined among diabetic patients from 11.2/1,000 persons in 1996 to 3.9/1,000 persons in 2008.
Data Source: An examination of trends in hospitalizations for nontraumatic lower-extremity amputations in the overall U.S. population using data from the National Hospital Discharge Survey and the National Health Interview Survey.
Disclosures: The authors disclosed no relevant conflicts of interest.
Brain Deficits Not Evident in Early Psychosis
The characteristic structural and functional brain deficits associated with chronic schizophrenia do not appear to be fully present in early psychosis patients, a report published in the Jan. 15 issue of Biological Psychiatry shows.
The findings suggest that early intervention strategies could delay, reduce, or prevent the development of brain deficits in some patients with early psychosis.
Previous studies have linked dynamic changes to hippocampal volume and neural activation with the transition to psychosis, but have not determined when such changes occur with respect to disease progression, wrote Lisa E. Williams, Ph.D., and her colleagues at Vanderbilt University in Nashville, Tenn. Because of the potential treatment implications, the investigators sought to ascertain whether hippocampal integrity is compromised in the early stage of a psychotic illness.
The investigators compared structural MRI data for 41 early psychosis patients and 34 healthy controls. Functional MRI data were analyzed for 27 of the early psychosis patients and 30 healthy control subjects who met memory training criteria. Most of the early psychosis patients had first-episode schizophrenia.
The early psychosis and control subjects in both the structural and functional MRI analysis cohorts were similar with respect to age, race, sex, and parental education, the authors wrote. All of the study participants completed a transitive interference task that has been used in the assessment of chronic schizophrenia patients, whereby they learned to identify the winning set of two sets of stimulus pairs drawn from an overlapping sequence and a nonoverlapping set. During the fMRI scan, they were tested on the previously learned conditions and asked to make inferential judgments on new pairings from each condition to evaluate hippocampal activation. With respect to the structural neuroimaging data, a single, blinded rater completed volumetric analysis, manually segmenting the volumes and conducting between-group comparisons of the volumes, which were corrected for individual intracranial volume.
The investigators determined that the early psychosis and control groups did not differ on inference performance or hippocampal volume, and both groups exhibited similar activation of medial temporal regions when judging nonoverlapping pairs, suggesting that "established abnormalities of brain structure and function found in chronic psychosis patients are not fully present in all early psychosis patients," they wrote (Biol. Psychiatry 2012;71:105-13). This result, however, appeared to be specific to those individuals with intact relational memory performance, whereas those who did not meet memory training criteria (25% of the early psychosis group) had smaller hippocampal volumes. This finding supports a link between relational memory and hippocampal integrity. In addition, it indicates that memory deficits and hippocampal volume reductions are present in some early psychosis patients.
The findings are limited by several factors. For example, the training performance criteria needed to test for a selective inference deficit limit the generalizability of the results to all psychotic patients. Another limitation was the short mean duration of illness. "Only 7 of 41 patients had a duration of treated illness greater than 6 months, and this factor was not correlated with inference performance," they stated.
Despite the limitations, the current findings "indicate that hippocampal dysfunction, well established in chronic schizophrenia patients, is not present at the onset of psychosis in all patients with early psychosis, providing an opportunity for clinical intervention."
The investigators disclosed having no relevant financial conflicts of interest.
FROM BIOLOGICAL PSYCHIATRY
Major Finding: Relational memory impairment and hippocampal abnormalities are not fully present in patients with early psychosis.
Data Source: Structural MRI data for 41 early psychosis patients were compared with those of 34 healthy controls. Functional MRI data were analyzed for 27 of the early psychosis patients and 30 controls.
Disclosures: The investigators disclosed having no relevant conflicts of interest.
Measles Outbreak: Wake-Up Call for Hospitals, Clinicians
A 2009 measles outbreak linked to an unvaccinated child who was treated in the emergency department of a hospital in southwestern Pennsylvania highlights both the potential for measles transmission in health care settings and the need for clinicians to include the disease in differential diagnoses of patients with fever and rash. Hospitals should document employees’ immunity and be otherwise prepared to limit outbreaks, according to a report released today by the Centers for Disease Control and Prevention in Morbidity and Mortality Weekly Report.
In March of 2009, a physician report to the Pennsylvania Department of Health of a measles case involving an unvaccinated child – followed within 5 days by reports of four additional cases, including three unvaccinated children and one physician, all of whom had been in the same community hospital ED on the same day earlier in the month – led to an electronic medical record review to identify the source patient.
The source was also a child (the brother of the initially diagnosed patient) who had recently arrived in the United States with his family from India, and who had been treated previously in the ED for a rash diagnosed as viral exanthema and released, the CDC reported (MMWR 2012;61:30-2).
The discovery led to an extensive and expensive regional search and investigation of the approximately 4,000 individuals who had been in contact with all six of the patients during the incubation period. A review of employee health records to identify exposed personnel without serologic evidence of measles immunity was also conducted. The investigation did not identify any additional measles cases, but did identify 72 of 168 potentially exposed employees with no documented measles immunity who were then required to undergo serologic testing and, if necessary, vaccination, according to the report. Testing found that eight did not have measles IgG antibodies, and of those, five were furloughed for 18 days after exposure.
Of note, the authors wrote in an editorial note, "of the six cases, only the index patient initially was suspected of having measles; therefore, he was the only patient for whom isolation precautions were taken," despite 2007 guidelines issued by the Hospital Infection Control Practices Advisory Committee (HICPAC) recommending precautions against airborne transmission for any patient who has a maculopapular rash accompanied by cough, coryza, and fever. The failure to readily identify potential measles cases may be attributable to U.S. clinicians’ limited experience with the contagious disease – itself an unintended consequence of the high U.S. vaccination coverage levels and the efficacy of the measles-mumps-rubella vaccine, they wrote.
In addition to the public health implications, health care–associated measles outbreaks pose a substantial burden on public health resources and health care facilities, in terms of the time and cost of extensive record reviews, contact tracing, and requisite communications, the authors wrote.
Fortunately, the scope of the Pennsylvania outbreak was limited, possibly because of high rates of measles immunization in the community, the fact that the affected children did not attend school or child care, and intense public health control efforts, they stated.
In addition to improving clinicians’ awareness of measles and the necessary isolation procedures, "all health care facilities should follow ACIP [Advisory Committee on Immunization Practices] and HICPAC guidelines that health-care facilities should ensure that their employees are fully vaccinated from measles or have laboratory evidence of immunity," the authors stressed. "This can minimize the need for emergency testing and furlough of employees exposed to measles and associated outbreaks."
A 2009 measles outbreak linked to an unvaccinated child who was treated in the emergency department of a hospital in southwestern Pennsylvania highlights both the potential for measles transmission in health care settings and the need for clinicians to include the disease in differential diagnoses of patients with fever and rash. Hospitals should document employees’ immunity and be otherwise prepared to limit outbreaks, according to a report released today by the Centers for Disease Control and Prevention in Morbidity and Mortality Weekly Report.
In March of 2009, a physician report to the Pennsylvania Department of Health of a measles case involving an unvaccinated child – followed within 5 days by reports of four additional reported cases, including three unvaccinated children and one physician, who had been in the same community hospital ED on the same day earlier in the month – led to an electronic medical record review to identify the source patient.
The source was also a child (the brother of the initially diagnosed patient) who had recently arrived in the United States with his family from India, and who had been treated previously in the ED for a rash diagnosed as viral exanthema and released, the CDC reported (MMWR 2012;61:30-2).
The discovery led to an extensive and expensive regional search and investigation of the approximately 4,000 individuals who had been in contact with all six of the patients during the incubation period. A review of employee health records to identify exposed personnel without serologic evidence of measles immunity was also conducted. The investigation did not identify any additional measles cases, but did identify 72 of 168 potentially exposed employees with no documented measles immunity who were then required to undergo serologic testing and, if necessary, vaccination, according to the report. Testing found that eight did not have measles IgG antibodies, and of those, five were furloughed for 18 days after exposure.
Of note, the authors wrote in an editorial note, "of the six cases, only the index patient initially was suspected of having measles; therefore, he was the only patient for whom isolation precautions were taken," despite 2007 guidelines issued by the Hospital Infection Control Practices Advisory Committee (HICPAC) recommending precautions against airborne transmission for any patient who has a maculopapular rash accompanied by cough, coryza, and fever. The failure to readily identify potential measles cases may be attributable to U.S. clinicians’ limited experience with the contagious disease – itself an unintended consequence of the high U.S. vaccination coverage levels and the efficacy of the measles-mumps-rubella vaccine, they wrote.
In addition to the public health implications, health care–associated measles outbreaks pose a substantial burden on public health resources and health care facilities, in terms of the time and cost of extensive record reviews, contact tracing, and requisite communications, the authors wrote.
Fortunately, the scope of the Pennsylvania outbreak was limited, possibly because of high rates of measles immunization in the community, the fact that the affected children did not attend school or child care, and intense public health control efforts, they stated.
In addition to improving clinicians’ awareness of measles and the necessary isolation procedures, "all health care facilities should follow [Advisory Committee on Immunization Practices] ACIP and HICPAC guidelines that health-care facilities should ensure that their employees are fully vaccinated from measles or have laboratory evidence of immunity," the authors stressed. "This can minimize the need for emergency testing and furlough of employees exposed to measles and associated outbreaks."
A 2009 measles outbreak linked to an unvaccinated child who was treated in the emergency department of a hospital in southwestern Pennsylvania highlights both the potential for measles transmission in health care settings and the need for clinicians to include the disease in differential diagnoses of patients with fever and rash. Hospitals should document employees’ immunity and be otherwise prepared to limit outbreaks, according to a report released today by the Centers for Disease Control and Prevention in Morbidity and Mortality Weekly Report.
In March of 2009, a physician report to the Pennsylvania Department of Health of a measles case involving an unvaccinated child – followed within 5 days by reports of four additional reported cases, including three unvaccinated children and one physician, who had been in the same community hospital ED on the same day earlier in the month – led to an electronic medical record review to identify the source patient.
The source was also a child (the brother of the initially diagnosed patient) who had recently arrived in the United States with his family from India, and who had been treated previously in the ED for a rash diagnosed as viral exanthema and released, the CDC reported (MMWR 2012;61:30-2).
The discovery led to an extensive and expensive regional search and investigation of the approximately 4,000 individuals who had been in contact with all six of the patients during the incubation period. A review of employee health records to identify exposed personnel without serologic evidence of measles immunity was also conducted. The investigation did not identify any additional measles cases, but did identify 72 of 168 potentially exposed employees with no documented measles immunity who were then required to undergo serologic testing and, if necessary, vaccination, according to the report. Testing found that eight did not have measles IgG antibodies, and of those, five were furloughed for 18 days after exposure.
Of note, the authors wrote in an editorial note, "of the six cases, only the index patient initially was suspected of having measles; therefore, he was the only patient for whom isolation precautions were taken," despite 2007 guidelines issued by the Hospital Infection Control Practices Advisory Committee (HICPAC) recommending precautions against airborne transmission for any patient who has a maculopapular rash accompanied by cough, coryza, and fever. The failure to readily identify potential measles cases may be attributable to U.S. clinicians’ limited experience with the contagious disease – itself an unintended consequence of the high U.S. vaccination coverage levels and the efficacy of the measles-mumps-rubella vaccine, they wrote.
Beyond their public health implications, health care–associated measles outbreaks impose a substantial burden on public health resources and health care facilities in terms of the time and cost of extensive record reviews, contact tracing, and the requisite communications, the authors wrote.
Fortunately, the scope of the Pennsylvania outbreak was limited, possibly because of high rates of measles immunization in the community, the fact that the affected children did not attend school or child care, and intense public health control efforts, they stated.
In addition to improving clinicians’ awareness of measles and the necessary isolation procedures, "all health care facilities should follow [Advisory Committee on Immunization Practices] ACIP and HICPAC guidelines that health-care facilities should ensure that their employees are fully vaccinated from measles or have laboratory evidence of immunity," the authors stressed. "This can minimize the need for emergency testing and furlough of employees exposed to measles and associated outbreaks."
FROM MORBIDITY AND MORTALITY WEEKLY REPORT
Palindromic Rheumatism: The Great Pretender
What looks like rheumatoid arthritis, feels like rheumatoid arthritis, but isn’t rheumatoid arthritis? Palindromic rheumatism.
First described in 1944 as a "new, oft-recurring disease of joints" (Arch. Intern. Med. 1944;73:293-321), palindromic rheumatism is similar to rheumatoid arthritis in that its characteristic features include pain, inflammation, and disability in and around one or multiple joints that last from a few hours to several days. Unlike the symptoms of RA, however, these idiopathic symptoms subside completely between episodes with no residual articular effects. The symptom-free periods can last from weeks to months, according to Dr. Carlo Maurizio Montecucco. "The frequency of acute attacks is variable, ranging from less than one every other month to one every other day, and patients rarely present with constitutional symptoms or fever," he said.
In this column, Dr. Montecucco discusses the pertinent diagnostic and treatment considerations for the management of palindromic rheumatism.
Rheumatology News: What are the key considerations that differentiate palindromic rheumatism from RA or other inflammatory joint conditions?
Dr. Montecucco: Differentiation from RA is quite easy based on the medical history and the characteristics of the arthritis. Differential diagnosis may be more difficult, however, with respect to other remitting/recurrent rheumatic complaints such as crystal arthropathies, Behçet’s disease, reactive arthritis, relapsing polychondritis, familial Mediterranean fever, and other autoinflammatory diseases.
RN: How is the condition diagnosed?
Dr. Montecucco: Palindromic rheumatism should be suspected after a several-month history of brief, sudden-onset, and recurrent episodes of monoarthritis or soft-tissue inflammation with three or more joints involved in different attacks and direct observation of one attack by a physician. The condition can be diagnosed after exclusion of other arthritides, in particular crystal deposition diseases. No single test can confirm a diagnosis. Erythrocyte sedimentation rate (ESR) and C-reactive protein (CRP) levels are usually within normal limits or slightly elevated; rheumatoid factor (RF), anti–citrullinated peptide antibodies (ACPAs), and antinuclear antibodies (ANA) may be positive in 30%-60% of the cases. Ultrasonography and magnetic resonance imaging (MRI) may show transient synovitis and subchondral bone edema during an attack, but these features are difficult to catch and not specific. No erosions are present on radiographs.
RN: Does palindromic rheumatism ever progress to RA, and if so, are there telltale signs to identify which patients might be more likely to develop RA?
Dr. Montecucco: Progression to RA occurs in about one-third to one-half of the patients. The latency period between the onset of palindromic rheumatism and the development of RA is highly variable, ranging from a few weeks to more than 10 years. Most progressors have ACPA in their baseline serum, so ACPA-positive palindromic rheumatism may be regarded as a prodromal phase of RA. Additional factors associated with the development of RA are RF positivity, involvement of the proximal interphalangeal joints or wrist, and female gender.
RN: How is the condition treated?
Dr. Montecucco: No randomized controlled clinical study has been done to date. According to several observational studies, and our experience as well, hydroxychloroquine may be effective in reducing the frequency and severity of the attacks and probably also in delaying the evolution to RA. ACPA-positive patients should stop smoking immediately. Nonsteroidal anti-inflammatory drugs are usually effective during painful attacks.
RN: Are biologic drugs ever part of the treatment protocol?
Dr. Montecucco: There is no evidence suggesting a role for biologic agents in prevention of either recurrent attacks or transition to RA. At present, biologic drugs should be given only to the patients who develop RA, according to the current guidelines for treatment of that condition.
RN: What is the prognosis for individuals diagnosed with palindromic rheumatism?
Dr. Montecucco: The prognosis is dependent on the evolution. Three patterns of disease evolution have been recognized in palindromic rheumatism patients: clinical remission of the attacks in about 10%-15% of the cases, a clinical course of recurrent attacks without persistent joint involvement in 40%-50%, or evolution to a chronic disease in approximately 35%-50%. In the majority of these cases, the chronic disease is RA. Evolution to a seronegative spondyloarthritis, connective tissue disease, or vasculitis is quite uncommon.
Dr. Montecucco is professor of rheumatology and chair of the department of rheumatology at S. Matteo University Hospital, Pavia, Italy. He had no relevant financial conflicts of interest to disclose.
This column, "Ask the Expert," appears regularly in Rheumatology News, a publication of Elsevier.
Biomarkers May Improve Lung Cancer Screening
New noninvasive screening technologies are poised to improve the diagnostic yield of advanced imaging in lung cancer and, by so doing, improve patient outcomes, according to Dr. Paul A. Bunn.
A blood test for detecting genetic mutations in circulating tumor cells of lung cancer specimens and a colorimetric sensor array that identifies cancerous compounds in exhaled human breath are among the technologies that could lead to earlier diagnosis and treatment, said Dr. Bunn, executive director of the International Association for the Study of Lung Cancer (IASLC).
Lung cancer treatment has been hampered in the past by late diagnoses, typically achieved using invasive procedures only after symptoms have presented, said Dr. Bunn, the James Dudley Professor of Lung Cancer Research at the University of Colorado at Denver.
"But this is changing quickly," he said. "Major breakthroughs are leading to interventions that make a huge difference and make it an exciting time for lung cancer."
The first such breakthrough has been the use of low-dose helical computed tomography, which can identify early-stage disease in asymptomatic individuals while exposing them to a fraction of the radiation emitted by a standard diagnostic chest CT or x-ray, Dr. Bunn said in a press briefing on research presented at a joint conference of the American Association for Cancer Research and the IASLC.
"Spiral CT scans reduced lung cancer mortality by 20% [among current or former heavy smokers] and increased the 7-year survival rate by 20% compared with standard chest x-rays," he said, citing preliminary results of the National Lung Screening Trial (NLST) (N. Engl. J. Med. 2011;365:395-409).
"The low-dose CT screening also increased the diagnosis of stage I cases and surgical cures while they decreased the number of stage IV diagnoses because patients were diagnosed earlier and cured," he said.
Unfortunately, the value of CT scans as a routine screening tool is limited by the technology’s low specificity. In the NLST study, approximately 24% of the participants screened positive based on abnormal CT scan findings, but only 4% of the abnormalities were confirmed as lung cancer. This has led to controversy over whether smokers should be routinely screened for lung cancer.
"The remaining 96% were false positives," said Dr. Bunn, who maintained that the technology, on its own, is currently not cost effective enough to recommend for routine annual screening. "Working up those nodules is incredibly expensive and complicated, and often leads to surgery for something that is benign, not malignant," he said.
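The false-positive figure follows directly from the two proportions cited above; a minimal back-of-envelope sketch (the 10,000-person cohort is hypothetical, chosen only for illustration, while the percentages come from the trial results as quoted):

```python
# Back-of-envelope check of the screening yield quoted from the NLST.
# The cohort size is a hypothetical round number; only the proportions
# (~24% abnormal scans, ~4% of abnormalities confirmed) are from the article.
participants = 10_000
screened_positive = round(0.24 * participants)    # abnormal CT findings
true_positives = round(0.04 * screened_positive)  # confirmed lung cancers
false_positives = screened_positive - true_positives

print(f"positive scans:    {screened_positive}")
print(f"confirmed cancers: {true_positives}")
print(f"false positives:   {false_positives} "
      f"({false_positives / screened_positive:.0%})")
```

Run as written, the sketch yields 2,400 positive scans, 96 confirmed cancers, and 2,304 false positives, or 96% of all abnormal findings.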
The cost/benefit ratio stands to improve substantially, however, as some of the noninvasive screening technologies presented at the conference come to fruition, Dr. Bunn predicted.
For example, Heidi S. Erickson, Ph.D., and her colleagues at the University of Texas M.D. Anderson Cancer Center in Houston have developed a highly sensitive method for detecting cancer mutations in DNA isolated from circulating tumor cells of non–small cell lung cancer (NSCLC). They use the mass spectrometry–based technology to look for any of 135 mutations among 13 genes representing multiple pathways known to be involved in lung cancer.
The methodology requires a simple blood test, which makes it less invasive than a biopsy. The information will ultimately help investigators understand the molecular characteristics of lung cancer treatment and progression, Dr. Erickson said in an interview. When perfected, it will also complement the information obtained via spiral CT scans by providing important insight into diagnostic, prognostic, and predictive markers of disease, thus aiding management decisions, she said.
Similarly, a test identifying lung cancer biomarkers through exhaled breath may also help clinicians and researchers identify which patients with abnormal CT scans need more aggressive follow-up, according to Dr. Bunn.
Dr. Nir Peled of the Sheba Medical Center in Tel Hashomer, Israel, presented data from a cross-sectional comparative survey using breath analyses, in which investigators captured the "metabolic biosignatures" – the pattern of volatile organic compounds (VOCs) – of 74 patients with solitary pulmonary nodules to determine the VOC profiles for malignant and benign lung nodules.
For the analyses, a patient’s breath is drawn across an array of nanomaterial-based sensors, and the patterns are captured using digital cameras. Of the 74 high-risk patients, 53 had malignant nodules and 21 had benign growths, Dr. Peled reported. Within the malignant group, 47 samples were NSCLC and 6 were small cell lung cancer. Further, 30 of the patients had early-stage (I/II) disease and 23 had advanced (stage III/IV) disease, he said.
"On analysis, two [VOCs] in patients’ exhaled breath showed statistically significant differences in concentration for benign and malignant lung nodules, and the sensor array distinguished between the corresponding collective VOC patterns with nearly 90% accuracy," Dr. Peled said in an interview.
Further, looking specifically at the malignant nodules, "the sensor array distinguished between small and non–small cell lung cancer with an accuracy approaching 94% and between early and advanced disease with nearly 90% accuracy."
Although the test is a work in progress and not yet ready for clinical application, the findings suggest that a noninvasive breath analysis could become a useful, cost-effective diagnostic tool for managing nodule-positive patients, Dr. Bunn stressed. "We make advances one step at a time, and these are first steps, but they’re important," he said.
Dr. Bunn disclosed financial relationships with numerous pharmaceutical companies. Dr. Peled said he had no relevant financial disclosures. No disclosures were received from Dr. Erickson.
FROM A PRESS BRIEFING