Updated Guidelines for Management of Non-ST-Elevation Acute Coronary Syndrome
Clinical question: What is the recommended approach for management of non-ST-elevation acute coronary syndrome (NSTE-ACS)?
Background: This is the first comprehensive update from the American Heart Association/American College of Cardiology (AHA/ACC) on NSTE-ACS since 2007 and follows a focused update published in 2012.
Synopsis: This guideline provides recommendations for acute and long-term care of patients with NSTE-ACS.
Cardiac-specific troponin assays (troponin I or T) are the mainstay for ACS diagnosis. When contemporary troponin assays are used for diagnosis, other biomarkers (CK-MB, myoglobin) are not useful.
Initial hospital care for all patients with NSTE-ACS should include early initiation of beta-blockers (within the first 24 hours), high-intensity statin therapy, P2Y12 inhibitor (clopidogrel or ticagrelor) plus aspirin, and parenteral anticoagulation.
An early invasive strategy (diagnostic angiography within 24 hours with intent to perform revascularization based on coronary anatomy) is preferred to an ischemia-guided strategy, particularly in high-risk NSTE-ACS patients (Global Registry of Acute Coronary Events [GRACE] score >140).
“Ischemia-guided” strategy replaces the term “conservative management strategy”; its focus on aggressive medical therapy is an option in selected low-risk patient populations (e.g., Thrombolysis in Myocardial Infarction [TIMI] risk score 0 or 1, GRACE score <109, low-risk troponin-negative females). Patients managed with an ischemia-guided strategy should undergo pre-discharge noninvasive stress testing for further risk stratification.
Regardless of angiography strategy (invasive vs. ischemia-guided), post-discharge dual antiplatelet therapy (clopidogrel or ticagrelor) is recommended for up to 12 months in all patients with NSTE-ACS. Prasugrel is an appropriate P2Y12 inhibitor option for patients following percutaneous coronary intervention with stent placement.
All patients with NSTE-ACS should be referred to an outpatient comprehensive cardiovascular rehabilitation program.
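The risk-score thresholds above can be sketched as a simple triage helper. This is illustrative only, not a clinical decision tool: the GRACE and TIMI cutoffs come from the guideline summary, but the fallback branch for intermediate scores is an assumption.

```python
def select_strategy(grace_score, timi_score=None, troponin_positive=True):
    """Illustrative sketch of the guideline's risk-based strategy selection.

    GRACE >140 -> high risk, early invasive strategy preferred.
    GRACE <109 with TIMI 0-1 and negative troponin -> candidate for an
    ischemia-guided strategy. The intermediate branch is an assumption.
    """
    if grace_score > 140:
        return "early invasive (angiography within 24 h)"
    if grace_score < 109 and timi_score in (0, 1) and not troponin_positive:
        return "ischemia-guided (with pre-discharge stress testing)"
    return "invasive strategy generally preferred; individualize timing"
```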
New Guidelines for Platelet Transfusions in Adults
Clinical question: What is the recommended approach to platelet transfusion in several common clinical scenarios?
Background: The AABB (formerly American Association of Blood Banks) developed these guidelines from a recent systematic review on platelet transfusion.
Synopsis: One strong recommendation was made based on moderate-quality evidence. Five weak or uncertain recommendations were made based on low- or very low-quality evidence.
For hospitalized patients with therapy-induced hypoproliferative thrombocytopenia, transfusion of up to a single unit of platelets is recommended for a platelet count of 10×10⁹ cells/L or less to reduce the risk of spontaneous bleeding (strong recommendation, moderate-quality evidence).
For patients undergoing elective central venous catheter placement, platelet transfusion is recommended for a platelet count of less than 20×10⁹ cells/L (weak recommendation, low-quality evidence).
For patients undergoing elective diagnostic lumbar puncture, platelet transfusion is recommended for a platelet count of less than 50×10⁹ cells/L (weak recommendation, very low-quality evidence).
For patients undergoing major elective non-neuraxial surgery, platelet transfusion is recommended for a platelet count of less than 50×10⁹ cells/L (weak recommendation, very low-quality evidence).
For patients undergoing cardiopulmonary bypass surgery, it is recommended that surgeons not perform routine transfusion of platelets in non-thrombocytopenic patients. For patients who have perioperative bleeding with thrombocytopenia and/or evidence of platelet dysfunction, platelet transfusion is recommended (weak recommendation, very low-quality evidence).
There is insufficient evidence to recommend for or against platelet transfusion in patients with intracranial hemorrhage who are taking antiplatelet medications (uncertain recommendation, very low-quality evidence).
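The scenario-specific thresholds above (in ×10⁹ cells/L) can be collected into a small lookup. The scenario keys are hypothetical names introduced here for illustration; the numeric cutoffs and the "10 or less" vs. "less than" distinction come from the recommendations as summarized above.

```python
# Platelet-count thresholds (x10^9 cells/L) from the recommendations above.
# Scenario keys are hypothetical identifiers, not AABB terminology.
THRESHOLDS = {
    "prophylaxis_hypoproliferative": 10,  # strong, moderate-quality evidence
    "central_venous_catheter": 20,        # weak, low-quality evidence
    "lumbar_puncture": 50,                # weak, very low-quality evidence
    "major_nonneuraxial_surgery": 50,     # weak, very low-quality evidence
}

def transfusion_indicated(scenario, platelet_count):
    """True if the count falls at/below the guideline threshold.

    The hypoproliferative recommendation applies at "10 or less";
    the procedural thresholds apply at "less than" the cutoff.
    """
    limit = THRESHOLDS[scenario]
    if scenario == "prophylaxis_hypoproliferative":
        return platelet_count <= limit
    return platelet_count < limit
```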
Shorter Treatment for Vertebral Osteomyelitis May Be as Effective as Longer Treatment
Clinical question: Is a six-week regimen of antibiotics as effective as a 12-week regimen in the treatment of vertebral osteomyelitis?
Background: The optimal duration of antibiotic treatment for vertebral osteomyelitis is unknown. Previous guidelines recommending six to 12 weeks of therapy have been based on expert opinion rather than clinical trial data.
Study design: Multi-center, open-label, randomized controlled trial.
Setting: Seventy-one medical care centers in France.
Synopsis: Three hundred fifty-six adult patients with culture-proven bacterial vertebral osteomyelitis were randomized to six- or 12-week antibiotic treatment regimens. The primary outcome was confirmed cure of infection at 12 months, defined as absence of pain and fever with a CRP level below 10 mg/L. Outcomes were determined by a blinded panel of physicians.
Results showed 90.9% of the patients in the six-week group, and 90.8% of the patients in the 12-week group, met criteria for clinical cure. The lower bound of the 95% confidence interval for the difference in percentages of cure between groups was -6.2%, satisfying the predetermined noninferiority margin of 10%.
Antibiotic therapy in this trial was governed by French guidelines, which recommend oral fluoroquinolones and rifampin as first-line agents for vertebral osteomyelitis. Median duration of IV antibiotic therapy was less than 14 days. Relatively few patients had abscesses, and only eight of the 145 patients with Staphylococcus aureus (SA) infections had methicillin-resistant SA (MRSA).
Bottom line: A six-week regimen of antibiotics was shown to be noninferior to a 12-week regimen for treatment of vertebral osteomyelitis. Treatment for longer than six weeks may be indicated in the setting of drug-resistant organisms, extensive bone destruction, or abscesses.
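The noninferiority arithmetic can be reproduced approximately. Assumptions: roughly equal arms of 178 patients (the exact arm sizes are not given here) and a normal-approximation confidence interval, which yields a lower bound near -5.9% rather than the trial's reported -6.2% (the trial presumably used exact counts); either way the bound clears the -10% margin.

```python
import math

def noninferiority_check(p_short, p_long, n_short, n_long,
                         margin=0.10, z=1.96):
    """Normal-approximation lower 95% CI bound for the cure-rate
    difference (short minus long); noninferior if it stays above -margin."""
    diff = p_short - p_long
    se = math.sqrt(p_short * (1 - p_short) / n_short
                   + p_long * (1 - p_long) / n_long)
    lower = diff - z * se
    return lower, lower > -margin

# Illustrative arm sizes: 356 total, assumed split ~equally.
lower, noninferior = noninferiority_check(0.909, 0.908, 178, 178)
```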
ITL: Physician Reviews of HM-Relevant Research
Clinical question: Is the risk of recurrence of Clostridium difficile infection (CDI) increased by the use of “non-CDI” antimicrobial agents (inactive against C. diff) during or after CDI therapy?
Background: Recurrence of CDI is expected to increase with use of non-CDI antimicrobials. Previous studies have not distinguished between the timing of non-CDI agents during and after CDI treatment, nor examined the effect of frequency, duration, or type of non-CDI antibiotic therapy.
Study design: Retrospective cohort.
Setting: Academic Veterans Affairs medical center.
Synopsis: All patients with CDI over a three-year period were evaluated to determine the association between non-CDI antimicrobial use during or within 30 days following CDI therapy and 90-day CDI recurrence. Of 246 patients, 57% received concurrent or subsequent non-CDI antimicrobials. CDI recurred in 40% of patients who received non-CDI antimicrobials and in 16% of those who did not (OR: 3.5, 95% CI: 1.9 to 6.5).
After multivariable adjustment (including age, duration of CDI treatment, comorbidity, hospital and ICU admission, and gastric acid suppression), those who received non-CDI antimicrobials during CDI therapy had no increased risk of recurrence. However, those who received any non-CDI antimicrobials after initial CDI treatment had an absolute recurrence rate of 48% with an adjusted OR of 3.02 (95% CI: 1.65 to 5.52). This increased risk of recurrence was unaffected by the number or duration of non-CDI antimicrobial prescriptions. Subgroup analysis by antimicrobial class revealed statistically significant associations only with beta-lactams and fluoroquinolones.
Bottom line: The risk of recurrence of CDI is tripled by exposure to non-CDI antimicrobials within 30 days after CDI treatment, irrespective of the number or duration of such exposures.
Citation: Drekonja DM, Amundson WH, DeCarolis DD, Kuskowski MA, Lederle FA, Johnson JR. Antimicrobial use and risk for recurrent Clostridium difficile infection. Am J Med. 2011;124:1081.e1-1081.e7.
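The unadjusted odds ratio can be reconstructed approximately from the percentages reported above. The counts here are rounded back-calculations (57% of 246 exposed; 40% vs. 16% recurrence), so this is an illustration of the 2×2 arithmetic, not the paper's exact figures.

```python
def odds_ratio(exposed_events, exposed_total, unexposed_events, unexposed_total):
    """Unadjusted odds ratio from a 2x2 table: (a*d) / (b*c)."""
    a = exposed_events
    b = exposed_total - exposed_events
    c = unexposed_events
    d = unexposed_total - unexposed_events
    return (a * d) / (b * c)

# Approximate counts reconstructed from the reported percentages:
n_exposed, n_unexposed = 140, 106  # ~57% and ~43% of 246
or_unadjusted = odds_ratio(round(0.40 * n_exposed), n_exposed,
                           round(0.16 * n_unexposed), n_unexposed)
# or_unadjusted lands near the reported OR of 3.5
```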
Clinical question: Is it safe to perform esophagogastroduodenoscopy (EGD) in patients with upper gastrointestinal (GI) hemorrhage and low hematocrit?
Background: Patients admitted with GI hemorrhage are generally volume-resuscitated aggressively upon admission. After hemodynamic stability has been achieved, some would advocate delaying EGD until the hemoglobin and hematocrit are above 10 g/dL and 30%, respectively. This study attempted to determine whether EGD is safe in the setting of low hematocrit levels.
Study design: Prospective cohort.
Setting: Parkland Memorial Hospital, Dallas.
Synopsis: A total of 920 patients with upper GI bleeding were divided into two groups: a low (<30%) hematocrit group and a high (≥30%) hematocrit group. The groups were compared on rates of cardiovascular events, need for surgery or angiography, mortality, and ICU transfer. Overall event rates were extremely low, with no differences between the two groups.
Bottom line: Transfusing to a target hematocrit of >30% should not be a prerequisite for EGD in patients who present with upper GI bleeding.
Citation: Balderas V, Bhore R, Lara LF, Spesivtseva J, Rockey DC. The hematocrit level in upper gastrointestinal hemorrhage: safety of endoscopy and outcomes. Am J Med. 2011;124:970-976.
In the Literature: Physician Reviews of HM-Related Research
In This Edition
Literature At A Glance
A guide to this month’s studies
- IDSA/ATS guidelines for community-acquired pneumonia
- Improved asthma with IL-13 antibody
- Rivaroxaban vs. warfarin for stroke prevention in atrial fibrillation
- Apixaban vs. warfarin for stroke prevention in atrial fibrillation
- Ultrasonography more sensitive than chest radiograph for pneumothorax
- Current readmission risk models inadequate
- Optimal fluid volume for acute pancreatitis
- Low mortality in saddle pulmonary embolism
Triage Decisions for Patients with Severe Community-Acquired Pneumonia Should Be Based on IDSA/ATS Guidelines, Not Inflammatory Biomarkers
Clinical question: Can C-reactive protein levels (CRP), procalcitonin, TNF-alpha, and cytokine levels predict the need for intensive-care admission more accurately than IDSA/ATS guidelines in patients with severe community-acquired pneumonia (CAP)?
Background: Inflammatory biomarkers, such as CRP and procalcitonin, have diagnostic and prognostic utility in patients with CAP. Whether these inflammatory biomarkers can help triage patients to the appropriate level of care is unknown.
Study design: Prospective case-control study.
Setting: Two university hospitals in Spain.
Synopsis: The study included 685 patients with severe CAP who did not require mechanical ventilation or vasopressor support. Serum levels of CRP, procalcitonin, TNF-alpha, IL-1, IL-6, IL-8, and IL-10, as well as Infectious Diseases Society of America/American Thoracic Society (IDSA/ATS) minor severity criteria data, were collected on admission. After controlling for age, comorbidities, and Pneumonia Severity Index (PSI) risk class, serum levels of CRP and procalcitonin were found to be significantly higher in ICU patients compared with non-ICU patients. Despite this, these inflammatory biomarkers did not add to the IDSA/ATS guidelines, which suggest that patients with three or more minor criteria be considered for ICU admission.
The study did suggest that patients with severe CAP and low levels of IL-6 and procalcitonin could potentially be managed safely outside of the ICU. However, hospitalists should be wary of applying the study results due to the small number of ICU patients in this study and the lack of real-time availability of these biomarkers at most institutions.
Bottom line: More studies of inflammatory biomarkers are needed before using them to determine the level of care required for patients with CAP. Until these data are available, physicians should use the IDSA/ATS guidelines to triage patients to the appropriate level of care.
Citation: Ramirez P, Ferrer M, Torres A, et al. Inflammatory biomarkers and prediction for intensive care unit admission pneumonia. Crit Care Med. 2011;39:2211-2217.
IL-13 Antibody Lebrikizumab Shows Promise as a New Therapy for Adults with Uncontrolled Asthma
Clinical question: Can lebrikizumab, an IL-13 antibody, improve asthma control in patients with uncontrolled asthma?
Background: Asthma is a complex disease, with varied patient response to treatment. Some patients have uncontrolled asthma despite inhaled glucocorticoids. It is postulated that IL-13 may account for this variability and that some patients with uncontrolled asthma are poorly controlled due to glucocorticoid resistance mediated by IL-13. Lebrikizumab is an IgG4 monoclonal antibody that binds to and inhibits the function of IL-13. This study was performed to see if this antibody would be effective in patients with uncontrolled asthma despite inhaled glucocorticoid therapy.
Study design: Randomized double-blinded placebo-controlled trial.
Setting: Multiple centers.
Synopsis: The study randomized 219 adult asthma patients who were inadequately controlled despite inhaled corticosteroids to placebo or lebrikizumab. The primary outcome was improvement in prebronchodilator FEV1 from baseline. Secondary outcomes were exacerbations, use of rescue medications, and symptom scores. Patients were also stratified and analyzed based on surrogate markers for IL-13 activity, which included serum IgE levels, eosinophil counts, and periostin levels.
In patients randomized to lebrikizumab, there was a statistically significant improvement in FEV1 of 5.5%, which occurred almost immediately and was sustained for the entire 32 weeks of the study. The improvement was greater in patients with high surrogate markers for IL-13. Despite this improvement in FEV1, there were no differences in secondary outcomes except in patients with surrogate markers suggesting high IL-13 levels.
Bottom line: In adults with asthma who remained uncontrolled despite inhaled corticosteroid therapy, IL-13 antagonism with lebrikizumab improved FEV1. However, the clinical relevance of these modest improvements remains unclear.
Citation: Corren J, Lemanske R, Matthews J, et al. Lebrikizumab treatment in adults with asthma. N Engl J Med. 2011;365:1088-1098.
Rivaroxaban Is Noninferior to Warfarin for Stroke Prevention in Atrial Fibrillation
Clinical question: How does rivaroxaban compare with warfarin in the prevention of stroke or systemic embolism in patients with nonvalvular atrial fibrillation?
Background: Warfarin is effective for the prevention of stroke in atrial fibrillation, but it requires close monitoring and adjustment. Rivaroxaban, an oral Xa inhibitor, may be safer, easier, and more effective than warfarin.
Study design: Multicenter, randomized, double-blind, double-dummy trial.
Setting: 1,178 sites in 45 countries.
Synopsis: The study included 14,264 patients with nonvalvular atrial fibrillation who were randomized to either fixed-dose rivaroxaban (20 mg daily or 15 mg daily for CrCl 30-49 mL/min) plus placebo or adjusted-dose warfarin (target INR 2.0 to 3.0) plus placebo. The mean CHADS2 score was 3.5. The primary endpoint (stroke or systemic embolism) occurred in 1.7% of patients per year in the rivaroxaban group and 2.2% per year in the warfarin group (hazard ratio for rivaroxaban 0.79; 95% CI: 0.66 to 0.96, P<0.001 for noninferiority). There was no difference in major or nonmajor clinically significant bleeding between the two groups (14.9% rivaroxaban vs. 14.5% warfarin, hazard ratio=1.03, 95% CI: 0.96 to 1.11, P=0.44). There were fewer intracranial hemorrhages (0.5% vs. 0.7%, P=0.02) and fatal bleeding (0.2% vs. 0.5%, P=0.003) in the rivaroxaban group.
Bottom line: In patients with atrial fibrillation, rivaroxaban was noninferior to warfarin for the prevention of stroke or systemic embolization, with a similar risk of major bleeding and a lower risk of intracranial hemorrhage or fatal bleeding.
Citation: Patel MR, Mahaffey K, Garg J, et al. Rivaroxaban versus warfarin in nonvalvular atrial fibrillation. N Engl J Med. 2011;365:883-891.
Apixaban More Effective and Safer than Warfarin for Stroke Prevention in Atrial Fibrillation
Clinical question: How does the effectiveness and safety of apixaban compare with warfarin for stroke prevention in atrial fibrillation?
Background: Until recently, warfarin has been the only available oral anticoagulant for stroke prevention in patients with atrial fibrillation (AF). The oral factor Xa inhibitors have shown similar efficacy and safety, without the monitoring requirement and drug interactions associated with warfarin.
Study design: Prospective randomized double-blind controlled trial.
Setting: More than 1,000 clinical sites in 39 countries.
Synopsis: This study randomized 18,201 patients with atrial fibrillation or flutter and at least one CHADS2 risk factor for stroke to receive oral apixaban or warfarin therapy. Exclusion criteria were prosthetic valves and severe kidney disease. The median duration of follow-up was 1.8 years, and the major endpoints were incidence of stroke, systemic embolism, bleeding complications, and mortality.
Compared with warfarin, apixaban reduced the annual incidence of stroke and systemic embolism from 1.6% to 1.3% (HR 0.79, 95%: CI 0.66 to 0.95, P=0.01 for superiority), and reduced mortality (HR: 0.89, 95% CI: 0.80 to 0.998). For the combined endpoint of stroke, systemic embolism, MI, or death, the annual rate was reduced from 5.5% to 4.9% (HR: 0.88, 95% CI: 0.80 to 0.97). All measures of bleeding were less frequent with apixaban: major 2.1% vs. 3.1% (HR: 0.69, 95% CI: 0.60 to 0.80), and combined major and minor bleeding 4.1% vs. 6.0% (HR: 0.68, 95% CI: 0.61 to 0.75). The annual rate for the net outcome of stroke, embolism, or major bleeding was 3.2% with apixaban and 4.1% with warfarin (HR: 0.77, 95% CI: 0.69 to 0.86).
Bottom line: Compared with warfarin therapy, apixaban is more effective and safer for stroke prevention in patients with atrial fibrillation.
Citation: Granger CB, Alexander JH, McMurray JJ, et al. Apixaban versus warfarin in patients with atrial fibrillation. N Engl J Med. 2011;365:981-992.
Ultrasonography Is Useful in Diagnosis of Pneumothorax
Clinical question: Is transthoracic ultrasonography a useful tool to diagnose pneumothorax?
Background: CT is the diagnostic gold standard for pneumothorax, but it is associated with radiation exposure and requires patient transport. Chest radiograph is easy to perform but may be too insensitive for adequate diagnosis. Ultrasonography’s diagnostic performance for detecting pneumothorax needs further evaluation.
Study design: Systematic review and meta-analysis.
Setting: Critically ill, trauma, or post-biopsy patients were identified in each of the studies.
Synopsis: The meta-analysis of 20 eligible studies found a pooled sensitivity of ultrasound for the detection of pneumothorax of 0.88 (95% CI: 0.85 to 0.91) and specificity of 0.99 (0.98 to 0.99) compared with sensitivity of 0.52 (0.49 to 0.55) and specificity of 1.00 (1.00 to 1.00) for chest radiograph. Although the overall ROC curve was not significantly different between these modalities, the accuracy of ultrasonography was highly dependent on the skill of the operator.
Bottom line: When performed by a skilled operator, transthoracic ultrasonography is as specific, and more sensitive, than chest radiograph in diagnosing pneumothorax.
Citation: Ding W, Shen Y, Yang J, He X, Zhang M. Diagnosis of pneumothorax by radiography and ultrasonography: a meta-analysis. Chest. 2011;140:859-866.
Risk Prediction for Hospital Readmission Remains Challenging
Clinical question: Can readmission risk assessment be used to identify which patients would benefit most from care-transition interventions, or to risk-adjust readmission rates for hospital comparison?
Background: Multiple models to predict hospital readmission have been described and validated. Identifying patients at high risk for readmission could allow for customized care-transition interventions, or could be used to risk-adjust readmission rates to compare publicly reported rates by hospital.
Study design: Systematic review with qualitative synthesis of results.
Setting: Thirty studies (23 from the U.S.) tested 26 unique readmission models.
Synopsis: Each model had been tested in both a derivation and validation cohort. Fourteen models (nine from the U.S.), using retrospective administrative data to compare risk-adjusted rates between hospitals, had poor discriminative capacity (c statistic range: 0.55 to 0.65). Seven models could be used to identify high-risk patients early in the hospitalization (c statistic range: 0.56 to 0.72) and five could be used to identify high-risk patients at discharge (c statistic range: 0.68 to 0.83), but these also had poor to moderate discriminative capacity. Multiple variables were considered in each of the models; most incorporated medical comorbidities and prior use of healthcare services.
Bottom line: Current readmission risk prediction models do not perform adequately for comparative or clinical purposes.
Citation: Kansagara D, Englander H, Salanitro A, et. al. Risk prediction models for hospital readmission: a systematic review. JAMA. 2011;306:1688-1698.
Intravenous Fluids for Acute Pancreatitis: More May Be Less
Clinical question: What is the optimal volume of fluid administration for treatment of acute pancreatitis?
Background: Current guidelines for management of acute pancreatitis emphasize vigorous administration of intravenous fluids to reduce the risk of pancreatic necrosis and organ failure. This recommendation is based upon animal studies, and has not been subjected to clinical evaluation in humans.
Study design: Prospective observational cohort.
Setting: University-affiliated tertiary-care public hospital in Spain.
Synopsis: This study enrolled 247 patients admitted with acute pancreatitis to determine the association between the volume of fluid administered during the first 24 hours and the development of persistent organ failure, pancreatic fluid collection or necrosis, and mortality. The volume and rate of fluid administered were determined by the treating physician. Patients were classified into three groups: those receiving a volume <3.1 L, 3.1 to 4.1 L, or >4.1 L.
After multivariate adjustment, those receiving <3.1 L had no increased risk of necrosis or any other adverse outcome, compared with those who received the middle range of fluid volume.
Patients receiving >4.1 L had a higher risk of persistent organ failure (OR: 7.7, 95% CI: 1.5 to 38.7), particularly renal and respiratory insufficiency, and fluid collection development (OR: 1.9, 95% CI: 1 to 3.7) independent of disease severity. Pancreatic necrosis and mortality were similar in the three groups.
Bottom line: Administration of large-volume intravenous fluids (>4.1 L) in
the first 24 hours was associated with worse outcomes, although residual confounding cannot be excluded in this nonrandomized study.
Citation: de-Madaria E, Soler-Sala G, Sanchez-Paya J, et al. Influence of fluid therapy on the prognosis of acute pancreatitis: a prospective cohort study. Am J Gastroenterol. 2011;106:1843-1850.
Clinical Outcomes in Saddle Pulmonary Embolism
Clinical question: What are the treatments used and outcomes associated with saddle pulmonary embolism?
Background: Saddle pulmonary embolism is a risk for right ventricular dysfunction and sudden hemodynamic collapse. There are limited data on the clinical presentation and outcomes in these patients.
Study design: Retrospective case review.
Setting: Single academic medical center.
Synopsis: In this retrospective review of 680 patients diagnosed with pulmonary embolism on CT at a single academic medical center from 2004 to 2009, 5.4% (37 patients) had a saddle pulmonary embolism.
Most patients with saddle pulmonary embolism were hemodynamically stable and responded to standard therapy with unfractionated heparin. The mean length of stay was nine days, 46% received an inferior vena cava filter, 41% were treated in an ICU, and 5.4% (two patients) died in the hospital. Thrombolytics were used in only 11% of patients, most of which had sustained hypotension and/or were mechanically ventilated.
Bottom line: Most patients with saddle pulmonary embolus in this single institution study did not receive thrombolytics and had overall low mortality.
Citation: Sardi A, Gluskin J, Guttentag A, Kotler MN, Braitman LE, Lippmann M. Saddle pulmonary embolism: is it as bad as it looks? A community hospital experience. Crit Care Med. 2011;39:2413-2418.
In This Edition
Literature At A Glance
A guide to this month’s studies
- IDSA/ATS guidelines for community-acquired pneumonia
- Improved asthma with IL-13 antibody
- Rivaroxaban vs. warfarin for stroke prevention in atrial fibrillation
- Apixaban vs. warfarin for stroke prevention in atrial fibrillation
- Ultrasonography more sensitive than chest radiograph for pneumothorax
- Current readmission risk models inadequate
- Optimal fluid volume for acute pancreatitis
- Low mortality in saddle pulmonary embolism
Triage Decisions for Patients with Severe Community-Acquired Pneumonia Should Be Based on IDSA/ATS Guidelines, Not Inflammatory Biomarkers
Clinical question: Can C-reactive protein (CRP), procalcitonin, TNF-alpha, and cytokine levels predict the need for intensive care unit (ICU) admission more accurately than the IDSA/ATS guidelines in patients with severe community-acquired pneumonia (CAP)?
Background: Inflammatory biomarkers, such as CRP and procalcitonin, have diagnostic and prognostic utility in patients with CAP. Whether these inflammatory biomarkers can help triage patients to the appropriate level of care is unknown.
Study design: Prospective case-control study.
Setting: Two university hospitals in Spain.
Synopsis: The study included 685 patients with severe CAP who did not require mechanical ventilation or vasopressor support. Serum levels of CRP, procalcitonin, TNF-alpha, IL-1, IL-6, IL-8, and IL-10, as well as Infectious Diseases Society of America/American Thoracic Society (IDSA/ATS) minor severity criteria data, were collected on admission. After controlling for age, comorbidities, and PSI risk class, serum levels of CRP and procalcitonin were significantly higher in ICU patients than in non-ICU patients. Despite this, the inflammatory biomarkers did not add to the IDSA/ATS guidelines, which recommend that patients with three or more minor criteria be considered for ICU admission.
The study did suggest that patients with severe CAP and low levels of IL-6 and procalcitonin could potentially be managed safely outside of the ICU. However, hospitalists should be wary of applying the study results due to the small number of ICU patients in this study and the lack of real-time availability of these biomarkers at most institutions.
Bottom line: More studies of inflammatory biomarkers are needed before using them to determine the level of care required for patients with CAP. Until these data are available, physicians should use the IDSA/ATS guidelines to triage patients to the appropriate level of care.
Citation: Ramirez P, Ferrer M, Torres A, et al. Inflammatory biomarkers and prediction for intensive care unit admission pneumonia. Crit Care Med. 2011;39:2211-2217.
IL-13 Antibody Lebrikizumab Shows Promise as a New Therapy for Adults with Uncontrolled Asthma
Clinical question: Can lebrikizumab, an IL-13 antibody, improve asthma control in patients with uncontrolled asthma?
Background: Asthma is a complex disease, with varied patient response to treatment. Some patients have uncontrolled asthma despite inhaled glucocorticoids. It is postulated that IL-13 may account for this variability and that some patients with uncontrolled asthma are poorly controlled due to glucocorticoid resistance mediated by IL-13. Lebrikizumab is an IgG4 monoclonal antibody that binds to and inhibits the function of IL-13. This study was performed to see if this antibody would be effective in patients with uncontrolled asthma despite inhaled glucocorticoid therapy.
Study design: Randomized, double-blind, placebo-controlled trial.
Setting: Multiple centers.
Synopsis: The study randomized 219 adult asthma patients who were inadequately controlled despite inhaled corticosteroids to placebo or lebrikizumab. The primary outcome was improvement in prebronchodilator FEV1 from baseline. Secondary outcomes were exacerbations, use of rescue medications, and symptom scores. Patients were also stratified and analyzed based on surrogate markers for IL-13, which included serum IgE levels, eosinophil counts, and periostin levels.
Patients randomized to lebrikizumab had a statistically significant 5.5% improvement in FEV1, which occurred almost immediately and was sustained for the entire 32 weeks of the study. The improvement was greater in patients with high surrogate markers for IL-13. Despite this improvement in FEV1, there were no differences in secondary outcomes except in patients whose surrogate markers suggested high IL-13 levels.
Bottom line: In adults with asthma who remained uncontrolled despite inhaled corticosteroid therapy, IL-13 antagonism with lebrikizumab improved FEV1. However, the clinical relevance of these modest improvements remains unclear.
Citation: Corren J, Lemanske R, Matthews J, et al. Lebrikizumab treatment in adults with asthma. N Engl J Med. 2011;365:1088-1098.
Rivaroxaban Is Noninferior to Warfarin for Stroke Prevention in Atrial Fibrillation
Clinical question: How does rivaroxaban compare with warfarin in the prevention of stroke or systemic embolism in patients with nonvalvular atrial fibrillation?
Background: Warfarin is effective for the prevention of stroke in atrial fibrillation, but it requires close monitoring and adjustment. Rivaroxaban, an oral Xa inhibitor, may be safer, easier, and more effective than warfarin.
Study design: Multicenter, randomized, double-blind, double-dummy trial.
Setting: 1,178 sites in 45 countries.
Synopsis: The study included 14,264 patients with nonvalvular atrial fibrillation who were randomized to either fixed-dose rivaroxaban (20 mg daily or 15 mg daily for CrCl 30-49 mL/min) plus placebo or adjusted-dose warfarin (target INR 2.0 to 3.0) plus placebo. The mean CHADS2 score was 3.5. The primary endpoint (stroke or systemic embolism) occurred in 1.7% of patients per year in the rivaroxaban group and 2.2% per year in the warfarin group (hazard ratio for rivaroxaban 0.79; 95% CI: 0.66 to 0.96, P<0.001 for noninferiority). There was no difference in the composite of major and nonmajor clinically relevant bleeding between the two groups (14.9% rivaroxaban vs. 14.5% warfarin, hazard ratio=1.03, 95% CI: 0.96 to 1.11, P=0.44). There were fewer intracranial hemorrhages (0.5% vs. 0.7%, P=0.02) and fatal bleeding (0.2% vs. 0.5%, P=0.003) in the rivaroxaban group.
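The clinical weight of these event rates is easier to see as an absolute risk reduction and a number needed to treat. A minimal sketch, using only the annual rates reported above (the function and its rounding convention are illustrative, not from the trial):

```python
def arr_and_nnt(control_rate, treatment_rate):
    """Absolute risk reduction and number needed to treat.

    Rates here are per patient-year, so the NNT is the number of
    patients treated for one year to prevent one event.
    """
    arr = control_rate - treatment_rate
    nnt = round(1 / arr)  # rounded to the nearest whole patient
    return arr, nnt

# Annual rates of stroke or systemic embolism reported above:
# warfarin 2.2% per year, rivaroxaban 1.7% per year
arr, nnt = arr_and_nnt(0.022, 0.017)
print(f"ARR {arr:.3%} per year; NNT {nnt} patient-years")
```

By this arithmetic, roughly 200 patients would need a year of rivaroxaban instead of warfarin to prevent one stroke or systemic embolism.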
Bottom line: In patients with atrial fibrillation, rivaroxaban was noninferior to warfarin for the prevention of stroke or systemic embolization, with a similar risk of major bleeding and a lower risk of intracranial hemorrhage or fatal bleeding.
Citation: Patel MR, Mahaffey K, Garg J, et al. Rivaroxaban versus warfarin in nonvalvular atrial fibrillation. N Engl J Med. 2011;365:883-891.
Apixaban More Effective and Safer than Warfarin for Stroke Prevention in Atrial Fibrillation
Clinical question: How does the effectiveness and safety of apixaban compare with warfarin for stroke prevention in atrial fibrillation?
Background: Until recently, warfarin has been the only available oral anticoagulant for stroke prevention in patients with atrial fibrillation (AF). The oral factor Xa inhibitors have shown similar efficacy and safety, without the monitoring requirement and drug interactions associated with warfarin.
Study design: Prospective randomized double-blind controlled trial.
Setting: More than 1,000 clinical sites in 39 countries.
Synopsis: This study randomized 18,201 patients with atrial fibrillation or flutter and at least one CHADS2 risk factor for stroke to receive oral apixaban or warfarin therapy. Exclusion criteria were prosthetic valves and severe kidney disease. The median duration of follow-up was 1.8 years, and the major endpoints were incidence of stroke, systemic embolism, bleeding complications, and mortality.
Compared with warfarin, apixaban reduced the annual incidence of stroke and systemic embolism from 1.6% to 1.3% (HR: 0.79, 95% CI: 0.66 to 0.95, P=0.01 for superiority), and reduced mortality (HR: 0.89, 95% CI: 0.80 to 0.998). For the combined endpoint of stroke, systemic embolism, MI, or death, the annual rate was reduced from 5.5% to 4.9% (HR: 0.88, 95% CI: 0.80 to 0.97). All measures of bleeding were less frequent with apixaban: major 2.1% vs. 3.1% (HR: 0.69, 95% CI: 0.60 to 0.80), and combined major and minor bleeding 4.1% vs. 6.0% (HR: 0.68, 95% CI: 0.61 to 0.75). The annual rate for the net outcome of stroke, embolism, or major bleeding was 3.2% with apixaban and 4.1% with warfarin (HR: 0.77, 95% CI: 0.69 to 0.86).
Bottom line: Compared with warfarin therapy, apixaban is more effective and safer for stroke prevention in patients with atrial fibrillation.
Citation: Granger CB, Alexander JH, McMurray JJ, et al. Apixaban versus warfarin in patients with atrial fibrillation. N Engl J Med. 2011;365:981-992.
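Enrollment in both atrial fibrillation trials above was keyed to the CHADS2 score, which is a simple point sum. A sketch of the standard scoring scheme (the function name and argument layout are illustrative, not from either trial protocol):

```python
def chads2_score(chf, hypertension, age, diabetes, prior_stroke_or_tia):
    """CHADS2 stroke-risk score for atrial fibrillation:
    1 point each for congestive heart failure, hypertension,
    age >= 75, and diabetes; 2 points for prior stroke or TIA."""
    score = sum([chf, hypertension, age >= 75, diabetes])
    score += 2 if prior_stroke_or_tia else 0
    return score

# e.g. a 78-year-old with hypertension and a prior TIA scores 4
print(chads2_score(chf=False, hypertension=True, age=78,
                   diabetes=False, prior_stroke_or_tia=True))
```

A score of 0 to 6 results; higher scores predict higher annual stroke risk, which is why the trial populations' mean scores matter when comparing results.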
Ultrasonography Is Useful in Diagnosis of Pneumothorax
Clinical question: Is transthoracic ultrasonography a useful tool to diagnose pneumothorax?
Background: CT is the diagnostic gold standard for pneumothorax, but it is associated with radiation exposure and requires patient transport. Chest radiograph is easy to perform but may be too insensitive for adequate diagnosis. Ultrasonography’s diagnostic performance for detecting pneumothorax needs further evaluation.
Study design: Systematic review and meta-analysis.
Setting: Studies enrolling critically ill, trauma, or post-biopsy patients.
Synopsis: The meta-analysis of 20 eligible studies found a pooled sensitivity of ultrasound for the detection of pneumothorax of 0.88 (95% CI: 0.85 to 0.91) and specificity of 0.99 (0.98 to 0.99) compared with sensitivity of 0.52 (0.49 to 0.55) and specificity of 1.00 (1.00 to 1.00) for chest radiograph. Although the overall ROC curve was not significantly different between these modalities, the accuracy of ultrasonography was highly dependent on the skill of the operator.
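One way to read these pooled operating characteristics is through likelihood ratios and Bayes' rule. A short sketch using the ultrasound estimates above (the 10% pretest probability is an arbitrary illustration, not a figure from the meta-analysis):

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a diagnostic test."""
    lr_pos = sensitivity / (1 - specificity)
    lr_neg = (1 - sensitivity) / specificity
    return lr_pos, lr_neg

def post_test_probability(pretest, lr):
    """Apply a likelihood ratio to a pretest probability via odds."""
    post_odds = (pretest / (1 - pretest)) * lr
    return post_odds / (1 + post_odds)

# Pooled ultrasound estimates above: sensitivity 0.88, specificity 0.99
lr_pos, lr_neg = likelihood_ratios(0.88, 0.99)
p = post_test_probability(0.10, lr_pos)  # hypothetical 10% pretest probability
print(f"LR+ {lr_pos:.0f}, LR- {lr_neg:.2f}, post-test {p:.0%}")
```

With a positive likelihood ratio near 88, a positive ultrasound converts even a modest pretest suspicion into a high post-test probability of pneumothorax.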
Bottom line: When performed by a skilled operator, transthoracic ultrasonography is as specific as, and more sensitive than, chest radiography in diagnosing pneumothorax.
Citation: Ding W, Shen Y, Yang J, He X, Zhang M. Diagnosis of pneumothorax by radiography and ultrasonography: a meta-analysis. Chest. 2011;140:859-866.
Risk Prediction for Hospital Readmission Remains Challenging
Clinical question: Can readmission risk assessment be used to identify which patients would benefit most from care-transition interventions, or to risk-adjust readmission rates for hospital comparison?
Background: Multiple models to predict hospital readmission have been described and validated. Identifying patients at high risk for readmission could allow for customized care-transition interventions, or could be used to risk-adjust readmission rates to compare publicly reported rates by hospital.
Study design: Systematic review with qualitative synthesis of results.
Setting: Thirty studies (23 from the U.S.) tested 26 unique readmission models.
Synopsis: Each model had been tested in both a derivation and validation cohort. Fourteen models (nine from the U.S.), using retrospective administrative data to compare risk-adjusted rates between hospitals, had poor discriminative capacity (c statistic range: 0.55 to 0.65). Seven models could be used to identify high-risk patients early in the hospitalization (c statistic range: 0.56 to 0.72) and five could be used to identify high-risk patients at discharge (c statistic range: 0.68 to 0.83), but these also had poor to moderate discriminative capacity. Multiple variables were considered in each of the models; most incorporated medical comorbidities and prior use of healthcare services.
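The c statistic cited for each model is simply the probability that the model assigns a higher risk score to a randomly chosen readmitted patient than to a randomly chosen non-readmitted one (0.5 is chance, 1.0 is perfect discrimination). A toy illustration with made-up scores, not data from the review:

```python
from itertools import product

def c_statistic(readmitted_scores, not_readmitted_scores):
    """Concordance (c) statistic: fraction of readmitted/non-readmitted
    pairs in which the readmitted patient is scored higher
    (ties count as half)."""
    pairs = list(product(readmitted_scores, not_readmitted_scores))
    concordant = sum(1.0 if r > n else 0.5 if r == n else 0.0
                     for r, n in pairs)
    return concordant / len(pairs)

# Made-up risk scores; 7 of the 9 pairs are correctly ordered
print(c_statistic([0.8, 0.6, 0.55], [0.7, 0.4, 0.3]))
```

On this scale, the 0.55 to 0.65 range reported for the hospital-comparison models is barely better than a coin flip, which is the basis for the review's conclusion.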
Bottom line: Current readmission risk prediction models do not perform adequately for comparative or clinical purposes.
Citation: Kansagara D, Englander H, Salanitro A, et al. Risk prediction models for hospital readmission: a systematic review. JAMA. 2011;306:1688-1698.
Intravenous Fluids for Acute Pancreatitis: More May Be Less
Clinical question: What is the optimal volume of fluid administration for treatment of acute pancreatitis?
Background: Current guidelines for management of acute pancreatitis emphasize vigorous administration of intravenous fluids to reduce the risk of pancreatic necrosis and organ failure. This recommendation is based upon animal studies, and has not been subjected to clinical evaluation in humans.
Study design: Prospective observational cohort.
Setting: University-affiliated tertiary-care public hospital in Spain.
Synopsis: This study enrolled 247 patients admitted with acute pancreatitis to determine the association between the volume of fluid administered during the first 24 hours and the development of persistent organ failure, pancreatic fluid collection or necrosis, and mortality. The volume and rate of fluid administered were determined by the treating physician. Patients were classified into three groups: those receiving a volume <3.1 L, 3.1 to 4.1 L, or >4.1 L.
After multivariate adjustment, those receiving <3.1 L had no increased risk of necrosis or any other adverse outcome, compared with those who received the middle range of fluid volume.
Patients receiving >4.1 L had a higher risk of persistent organ failure (OR: 7.7, 95% CI: 1.5 to 38.7), particularly renal and respiratory insufficiency, and fluid collection development (OR: 1.9, 95% CI: 1 to 3.7) independent of disease severity. Pancreatic necrosis and mortality were similar in the three groups.
Bottom line: Administration of large-volume intravenous fluids (>4.1 L) in the first 24 hours was associated with worse outcomes, although residual confounding cannot be excluded in this nonrandomized study.
Citation: de-Madaria E, Soler-Sala G, Sanchez-Paya J, et al. Influence of fluid therapy on the prognosis of acute pancreatitis: a prospective cohort study. Am J Gastroenterol. 2011;106:1843-1850.
Clinical Outcomes in Saddle Pulmonary Embolism
Clinical question: What are the treatments used and outcomes associated with saddle pulmonary embolism?
Background: Saddle pulmonary embolism is a risk for right ventricular dysfunction and sudden hemodynamic collapse. There are limited data on the clinical presentation and outcomes in these patients.
Study design: Retrospective case review.
Setting: Single academic medical center.
Synopsis: In this retrospective review of 680 patients diagnosed with pulmonary embolism on CT at a single academic medical center from 2004 to 2009, 5.4% (37 patients) had a saddle pulmonary embolism.
Most patients with saddle pulmonary embolism were hemodynamically stable and responded to standard therapy with unfractionated heparin. The mean length of stay was nine days; 46% received an inferior vena cava filter, 41% were treated in an ICU, and 5.4% (two patients) died in the hospital. Thrombolytics were used in only 11% of patients, most of whom had sustained hypotension and/or were mechanically ventilated.
Bottom line: Most patients with saddle pulmonary embolism in this single-institution study did not receive thrombolytics, and overall mortality was low.
Citation: Sardi A, Gluskin J, Guttentag A, Kotler MN, Braitman LE, Lippmann M. Saddle pulmonary embolism: is it as bad as it looks? A community hospital experience. Crit Care Med. 2011;39:2413-2418.
In This Edition
Literature At A Glance
A guide to this month’s studies
- IDSA/ATS guidelines for community-acquired pneumonia
- Improved asthma with IL-13 antibody
- Rivaroxaban vs. warfarin for stroke prevention in atrial fibrillation
- Apixaban vs. warfarin for stroke prevention in atrial fibrillation
- Ultrasonography more sensitive than chest radiograph for pneumothorax
- Current readmission risk models inadequate
- Optimal fluid volume for acute pancreatitis
- Low mortality in saddle pulmonary embolism
Triage Decisions for Patients with Severe Community-Acquired Pneumonia Should Be Based on IDSA/ATS Guidelines, Not Inflammatory Biomarkers
Clinical question: Can C-reactive protein levels (CRP), procalcitonin, TNF-alpha, and cytokine levels predict the need for intensive-care admission more accurately than IDSA/ATS guidelines in patients with severe community-acquired pneumonia (CAP)?
Background: Inflammatory biomarkers, such as CRP and procalcitonin, have diagnostic and prognostic utility in patients with CAP. Whether these inflammatory biomarkers can help triage patients to the appropriate level of care is unknown.
Study design: Prospective case control study.
Setting: Two university hospitals in Spain.
Synopsis: The study included 685 patients with severe CAP who did not require mechanical ventilation or vasopressor support. Serum levels of CRP, procalcitonin, TNF-alpha, IL-1, IL-6, IL-8, and IL-10, as well as Infectious Diseases Society of American/American Thoracic Society (IDSA/ATS) minor severity criteria data, were collected on admission. After controlling for age, comorbidities, and PSI risk class, serum levels of CRP and procalcitonin were found to be significantly higher in ICU patients compared with non-ICU patients. Despite this, these inflammatory biomarkers did not augment the IDSA/ATS guidelines, suggesting that patients who have three or more minor criteria be considered for ICU admission.
The study did suggest that patients with severe CAP and low levels of IL-6 and procalcitonin could potentially be managed safely outside of the ICU. However, hospitalists should be wary of applying the study results due to the small number of ICU patients in this study and the lack of real-time availability of these biomarkers at most institutions.
Bottom line: More studies of inflammatory biomarkers are needed before using them to determine the level of care required for patients with CAP. Until these data are available, physicians should use the IDSA/ATS guidelines to triage patients to the appropriate level of care.
Citation: Ramirez P, Ferrer M, Torres A, et al. Inflammatory biomarkers and prediction for intensive care unit admission pneumonia. Crit Care Med. 2011;39:2211-2217.
IL-13 Antibody Lebrikizumab Shows Promise as a New Therapy for Adults with Uncontrolled Asthma
Clinical question: Can lebrikizumab, an IL-13 antibody, improve asthma control in patients with uncontrolled asthma?
Background: Asthma is a complex disease, with varied patient response to treatment. Some patients have uncontrolled asthma despite inhaled glucocorticoids. It is postulated that IL-13 may account for this variability and that some patients with uncontrolled asthma are poorly controlled due to glucocorticoid resistance mediated by IL-13. Lebrikizumab is an IgG4 monoclonal antibody that binds to and inhibits the function of IL-13. This study was performed to see if this antibody would be effective in patients with uncontrolled asthma despite inhaled glucocorticoid therapy.
Study design: Randomized double-blinded placebo-controlled trial.
Setting: Multiple centers.
Synopsis: The study randomized 219 adult asthma patients who were inadequately controlled despite inhaled corticosteroids to a placebo or lebrikizumab. The primary outcome was improvement in prebronchodilator FEV1 from baseline. Secondary outcomes were exacerbations, use of rescue medications, and symptom scores. Patients were also stratified and analyzed based on surrogate markers for IL-13, which included serum IGE levels, eosinophil counts, and periostin levels.
In patients who were randomized to the lebrikizumab treatment, there was a statistically significant improvement in FEV1 of 5.5%, which occurred almost immediately and was sustained for the entire 32 weeks of the study. The improvement was more significant in patients who had high surrogate markers for IL-13. Despite this improvement in FEV1, there were no differences in secondary outcomes except in patients who had surrogate markers for high IL-13 levels.
Bottom line: In adults with asthma who remained uncontrolled despite inhaled corticosteroid therapy, IL-13 antagonism with lebrikizumab improved FEV1. However, the clinical relevance of these modest improvements remains unclear.
Citation: Corren J, Lemanske R, Matthews J, et al. Lebrikizumab treatment in adults with asthma. N Engl J Med. 2011;365:1088-1098.
Rivaroxaban Is Noninferior to Warfarin for Stroke Prevention in Atrial Fibrillation
Clinical question: How does rivaroxaban compare with warfarin in the prevention of stroke or systemic embolism in patients with nonvalvular atrial fibrillation?
Background: Warfarin is effective for the prevention of stroke in atrial fibrillation, but it requires close monitoring and adjustment. Rivaroxaban, an oral Xa inhibitor, may be safer, easier, and more effective than warfarin.
Study design: Multicenter, randomized, double-blind, double-dummy trial.
Setting: 1,178 sites in 45 countries.
Synopsis: The study included 14,264 patients with nonvalvular atrial fibrillation who were randomized to either fixed-dose rivaroxaban (20 mg daily or 15 mg daily for CrCl 30-49 mL/min) plus placebo or adjusted-dose warfarin (target INR 2.0 to 3.0) plus placebo. The mean CHADS2 score was 3.5. The primary endpoint (stroke or systemic embolism) occurred in 1.7% of patients per year in the rivaroxaban group and 2.2% per year in the warfarin group (hazard ratio for rivaroxaban 0.79; 95% CI: 0.66 to 0.96, P<0.001 for noninferiority). There was no difference in major or nonmajor clinically significant bleeding between the two groups (14.9% rivaroxaban vs. 14.5% warfarin, hazard ratio=1.03, 95% CI: 0.96 to 1.11, P=0.44). There were fewer intracranial hemorrhages (0.5% vs. 0.7%, P=0.02) and fatal bleeding (0.2% vs. 0.5%, P=0.003) in the rivaroxaban group.
Bottom line: In patients with atrial fibrillation, rivaroxaban was noninferior to warfarin for the prevention of stroke or systemic embolization, with a similar risk of major bleeding and a lower risk of intracranial hemorrhage or fatal bleeding.
Citation: Patel MR, Mahaffey K, Garg J, et al. Rivaroxaban versus warfarin in nonvalvular atrial fibrillation. N Engl J Med. 2011;365:883-891.
Apixaban More Effective and Safer than Warfarin for Stroke Prevention in Atrial Fibrillation
Clinical question: How does the effectiveness and safety of apixaban compare with warfarin for stroke prevention in atrial fibrillation?
Background: Until recently, warfarin has been the only available oral anticoagulant for stroke prevention in patients with atrial fibrillation (AF). The oral factor Xa inhibitors have shown similar efficacy and safety, without the monitoring requirement and drug interactions associated with warfarin.
Study design: Prospective randomized double-blind controlled trial.
Setting: More than 1,000 clinical sites in 39 countries.
Synopsis: This study randomized 18,201 patients with atrial fibrillation or flutter and at least one CHADS2 risk factor for stroke to receive oral apixaban or warfarin therapy. Exclusion criteria were prosthetic valves and severe kidney disease. The median duration of follow-up was 1.8 years, and the major endpoints were incidence of stroke, systemic embolism, bleeding complications, and mortality.
Compared with warfarin, apixaban reduced the annual incidence of stroke and systemic embolism from 1.6% to 1.3% (HR: 0.79, 95% CI: 0.66 to 0.95, P=0.01 for superiority) and reduced mortality (HR: 0.89, 95% CI: 0.80 to 0.998). For the combined endpoint of stroke, systemic embolism, MI, or death, the annual rate was reduced from 5.5% to 4.9% (HR: 0.88, 95% CI: 0.80 to 0.97). All measures of bleeding were less frequent with apixaban: major bleeding, 2.1% vs. 3.1% (HR: 0.69, 95% CI: 0.60 to 0.80), and combined major and minor bleeding, 4.1% vs. 6.0% (HR: 0.68, 95% CI: 0.61 to 0.75). The annual rate of the net outcome of stroke, embolism, or major bleeding was 3.2% with apixaban and 4.1% with warfarin (HR: 0.77, 95% CI: 0.69 to 0.86).
Bottom line: Compared with warfarin therapy, apixaban is more effective and safer for stroke prevention in patients with atrial fibrillation.
Citation: Granger CB, Alexander JH, McMurray JJ, et al. Apixaban versus warfarin in patients with atrial fibrillation. N Engl J Med. 2011;365:981-992.
Ultrasonography Is Useful in Diagnosis of Pneumothorax
Clinical question: Is transthoracic ultrasonography a useful tool to diagnose pneumothorax?
Background: CT is the diagnostic gold standard for pneumothorax, but it is associated with radiation exposure and requires patient transport. Chest radiograph is easy to perform but may be too insensitive for adequate diagnosis. Ultrasonography’s diagnostic performance for detecting pneumothorax needs further evaluation.
Study design: Systematic review and meta-analysis.
Setting: Critically ill, trauma, or post-biopsy patients were identified in each of the studies.
Synopsis: The meta-analysis of 20 eligible studies found a pooled sensitivity of ultrasound for the detection of pneumothorax of 0.88 (95% CI: 0.85 to 0.91) and specificity of 0.99 (0.98 to 0.99) compared with sensitivity of 0.52 (0.49 to 0.55) and specificity of 1.00 (1.00 to 1.00) for chest radiograph. Although the overall ROC curve was not significantly different between these modalities, the accuracy of ultrasonography was highly dependent on the skill of the operator.
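The pooled sensitivity and specificity above can be re-expressed as likelihood ratios, which are often easier to apply at the bedside. A minimal sketch using the ultrasound figures:

```python
# Convert pooled sensitivity/specificity into likelihood ratios.
sens, spec = 0.88, 0.99  # pooled ultrasound values from the meta-analysis above

lr_positive = sens / (1 - spec)  # how much a positive scan raises the odds
lr_negative = (1 - sens) / spec  # how much a negative scan lowers the odds

print(f"LR+ = {lr_positive:.0f}")  # 88
print(f"LR- = {lr_negative:.2f}")  # 0.12
```

A positive-test likelihood ratio near 88 means a positive ultrasound is very strong evidence for pneumothorax, while the LR- of about 0.12 reflects the imperfect sensitivity.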
Bottom line: When performed by a skilled operator, transthoracic ultrasonography is as specific as, and more sensitive than, chest radiography in diagnosing pneumothorax.
Citation: Ding W, Shen Y, Yang J, He X, Zhang M. Diagnosis of pneumothorax by radiography and ultrasonography: a meta-analysis. Chest. 2011;140:859-866.
Risk Prediction for Hospital Readmission Remains Challenging
Clinical question: Can readmission risk assessment be used to identify which patients would benefit most from care-transition interventions, or to risk-adjust readmission rates for hospital comparison?
Background: Multiple models to predict hospital readmission have been described and validated. Identifying patients at high risk for readmission could allow for customized care-transition interventions, or could be used to risk-adjust readmission rates to compare publicly reported rates by hospital.
Study design: Systematic review with qualitative synthesis of results.
Setting: Thirty studies (23 from the U.S.) tested 26 unique readmission models.
Synopsis: Each model had been tested in both a derivation and validation cohort. Fourteen models (nine from the U.S.), using retrospective administrative data to compare risk-adjusted rates between hospitals, had poor discriminative capacity (c statistic range: 0.55 to 0.65). Seven models could be used to identify high-risk patients early in the hospitalization (c statistic range: 0.56 to 0.72) and five could be used to identify high-risk patients at discharge (c statistic range: 0.68 to 0.83), but these also had poor to moderate discriminative capacity. Multiple variables were considered in each of the models; most incorporated medical comorbidities and prior use of healthcare services.
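The c statistic cited above is the probability that a randomly chosen patient who was readmitted received a higher predicted risk than a randomly chosen patient who was not (0.5 is chance, 1.0 is perfect discrimination). A minimal pairwise computation, using hypothetical predicted risks for illustration:

```python
# c statistic (equivalent to the ROC AUC) computed by comparing all
# event/non-event pairs; ties count as half-concordant.
def c_statistic(risks_events, risks_nonevents):
    pairs = concordant = ties = 0
    for e in risks_events:
        for n in risks_nonevents:
            pairs += 1
            if e > n:
                concordant += 1
            elif e == n:
                ties += 1
    return (concordant + 0.5 * ties) / pairs

# Hypothetical predicted readmission risks (not data from the review)
readmitted = [0.8, 0.6, 0.55]
not_readmitted = [0.5, 0.4, 0.6]
print(round(c_statistic(readmitted, not_readmitted), 2))  # 0.83
```

Values in the 0.55 to 0.65 range reported for the administrative-data models are only slightly better than a coin flip, which is why the review judged them inadequate.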
Bottom line: Current readmission risk prediction models do not perform adequately for comparative or clinical purposes.
Citation: Kansagara D, Englander H, Salanitro A, et al. Risk prediction models for hospital readmission: a systematic review. JAMA. 2011;306:1688-1698.
Intravenous Fluids for Acute Pancreatitis: More May Be Less
Clinical question: What is the optimal volume of fluid administration for treatment of acute pancreatitis?
Background: Current guidelines for management of acute pancreatitis emphasize vigorous administration of intravenous fluids to reduce the risk of pancreatic necrosis and organ failure. This recommendation is based upon animal studies, and has not been subjected to clinical evaluation in humans.
Study design: Prospective observational cohort.
Setting: University-affiliated tertiary-care public hospital in Spain.
Synopsis: This study enrolled 247 patients admitted with acute pancreatitis to determine the association between the volume of fluid administered during the first 24 hours and the development of persistent organ failure, pancreatic fluid collection or necrosis, and mortality. The volume and rate of fluid administered were determined by the treating physician. Patients were classified into three groups: those receiving a volume <3.1 L, 3.1 to 4.1 L, or >4.1 L.
After multivariate adjustment, those receiving <3.1 L had no increased risk of necrosis or any other adverse outcome, compared with those who received the middle range of fluid volume.
Patients receiving >4.1 L had a higher risk of persistent organ failure (OR: 7.7, 95% CI: 1.5 to 38.7), particularly renal and respiratory insufficiency, and fluid collection development (OR: 1.9, 95% CI: 1 to 3.7) independent of disease severity. Pancreatic necrosis and mortality were similar in the three groups.
Bottom line: Administration of large-volume intravenous fluids (>4.1 L) in the first 24 hours was associated with worse outcomes, although residual confounding cannot be excluded in this nonrandomized study.
Citation: de-Madaria E, Soler-Sala G, Sanchez-Paya J, et al. Influence of fluid therapy on the prognosis of acute pancreatitis: a prospective cohort study. Am J Gastroenterol. 2011;106:1843-1850.
Clinical Outcomes in Saddle Pulmonary Embolism
Clinical question: What are the treatments used and outcomes associated with saddle pulmonary embolism?
Background: Saddle pulmonary embolism is a risk for right ventricular dysfunction and sudden hemodynamic collapse. There are limited data on the clinical presentation and outcomes in these patients.
Study design: Retrospective case review.
Setting: Single academic medical center.
Synopsis: In this retrospective review of 680 patients diagnosed with pulmonary embolism on CT at a single academic medical center from 2004 to 2009, 5.4% (37 patients) had a saddle pulmonary embolism.
Most patients with saddle pulmonary embolism were hemodynamically stable and responded to standard therapy with unfractionated heparin. The mean length of stay was nine days; 46% received an inferior vena cava filter, 41% were treated in an ICU, and 5.4% (two patients) died in the hospital. Thrombolytics were used in only 11% of patients, most of whom had sustained hypotension and/or were mechanically ventilated.
Bottom line: Most patients with saddle pulmonary embolus in this single institution study did not receive thrombolytics and had overall low mortality.
Citation: Sardi A, Gluskin J, Guttentag A, Kotler MN, Braitman LE, Lippmann M. Saddle pulmonary embolism: is it as bad as it looks? A community hospital experience. Crit Care Med. 2011;39:2413-2418.
ITL: Physician Reviews of HM-Relevant Research
Clinical question: Is the risk of recurrence of Clostridium difficile infection (CDI) increased by the use of "non-CDI" antimicrobial agents (inactive against C. diff) during or after CDI therapy?
Background: Recurrence of CDI is expected to increase with use of non-CDI antimicrobials. Previous studies have not distinguished between the timing of non-CDI agents during and after CDI treatment, nor examined the effect of frequency, duration, or type of non-CDI antibiotic therapy.
Study design: Retrospective cohort.
Setting: Academic Veterans Affairs medical center.
Synopsis: All patients with CDI over a three-year period were evaluated to determine the association between non-CDI antimicrobial use during or within 30 days following CDI therapy and 90-day CDI recurrence. Of 246 patients, 57% received concurrent or subsequent non-CDI antimicrobials. CDI recurred in 40% of patients who received non-CDI antimicrobials and in 16% of those who did not (OR: 3.5, 95% CI: 1.9 to 6.5).
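The unadjusted odds ratio can be reconstructed from the recurrence proportions reported above (40% vs. 16%); a quick arithmetic check:

```python
# Reconstruct the unadjusted odds ratio from the recurrence proportions above.
p_exposed = 0.40    # CDI recurrence with non-CDI antimicrobials
p_unexposed = 0.16  # CDI recurrence without

odds_exposed = p_exposed / (1 - p_exposed)        # 0.40/0.60
odds_unexposed = p_unexposed / (1 - p_unexposed)  # 0.16/0.84
odds_ratio = odds_exposed / odds_unexposed

print(f"OR = {odds_ratio:.1f}")  # 3.5, matching the reported estimate
```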
After multivariable adjustment (including age, duration of CDI treatment, comorbidity, hospital and ICU admission, and gastric acid suppression), those who received non-CDI antimicrobials during CDI therapy had no increased risk of recurrence. However, those who received any non-CDI antimicrobials after initial CDI treatment had an absolute recurrence rate of 48% with an adjusted OR of 3.02 (95% CI: 1.65 to 5.52). This increased risk of recurrence was unaffected by the number or duration of non-CDI antimicrobial prescriptions. Subgroup analysis by antimicrobial class revealed statistically significant associations only with beta-lactams and fluoroquinolones.
Bottom line: The risk of recurrence of CDI is tripled by exposure to non-CDI antimicrobials within 30 days after CDI treatment, irrespective of the number or duration of such exposures.
Citation: Drekonja DM, Amundson WH, DeCarolis DD, Kuskowski MA, Lederle FA, Johnson JR. Antimicrobial use and risk for recurrent Clostridium difficile infection. Am J Med. 2011;124:1081.e1-1081.e7.
In the Literature
Thrombocytopenia Reaction to Vancomycin
Von Drygalski A, Curtis BR, Bougie DW, et al. Vancomycin-induced immune thrombocytopenia. N Engl J Med. 2007 Mar 1;356(9):904-910
The use of vancomycin has grown exponentially in the past 20 years.1 Physicians have become increasingly aware of its major side effects, such as red man syndrome, hypersensitivity, neutropenia, and nephrotoxicity. But there have been only a few case reports of thrombocytopenia associated with this drug. This article looked at cases of thrombocytopenia in patients referred for clinical suspicion of vancomycin-induced thrombocytopenia.
From 2001 to 2005, serum samples from several sites were sent to the Platelet and Neutrophil Immunology Laboratory at the BloodCenter of Wisconsin in Milwaukee for vancomycin-dependent antibody testing. Clinical information regarding these patients was obtained from their referring physicians and one of the authors. Platelet-reactive antibodies were detected by flow cytometry.
IgG and IgM vancomycin-dependent antibodies were detected in 34 patients. Platelet counts dropped an average of 93% from pretreatment levels, with the nadir occurring on average on day eight; the mean nadir platelet count was 13,600. After vancomycin was discontinued, the platelet count returned to normal in all patients except the three who died. The average time to resolution of thrombocytopenia was 7.5 days.
Unlike other drug-induced thrombocytopenias, the thrombocytopenia associated with vancomycin appears more prone to significant hemorrhage. In this group, 34% were found to have had severe hemorrhage, defined in this study as florid petechial hemorrhages, ecchymoses, and oozing from the buccal mucosa. Three patients with renal insufficiency were profoundly thrombocytopenic for a longer duration, presumably because of delayed clearance of vancomycin in this setting.
Based on this study, it appears thrombocytopenia is a significant adverse reaction that can be attributed to vancomycin. Unlike other drug-induced thrombocytopenias, it appears to be associated with a higher likelihood of significant hemorrhage, as well.
Thrombocytopenia is a common occurrence in acutely ill hospitalized patients and has been linked to increased hospital mortality and length of stay.2 Many drugs and diseases that hospitalists treat are associated with thrombocytopenia. The indications for vancomycin continue to grow with the increasing number of patients with prosthetic devices and intravascular access and the increasing prevalence of MRSA. This study raises awareness of a significant side effect that can be associated with vancomycin.
References
- Ena J, Dick RW, Jones RN, et al. The epidemiology of intravenous vancomycin usage in a university hospital: a 10-year study. JAMA. 1993 Feb 3;269(5):598-602. Comment in JAMA. 1993 Sep 22-29;270(12):1426.
- Crowther MA, Cook DJ, Meade M, et al. Thrombocytopenia in medical-surgical critically ill patients: prevalence, incidence, and risk factors. J Crit Care. 2005 Dec;20(4):248-253.
Can the mBRS Stratify Pts Admitted for Nonvariceal Upper GI Bleeds?
Romagnuolo J, Barkun AN, Enns R, et al. Simple clinical predictors may obviate urgent endoscopy in selected patients with nonvariceal upper gastrointestinal tract bleeding. Arch Intern Med. 2007 Feb 12;167(3):265-270.
Nonvariceal upper gastrointestinal bleeding is one of the top 10 admission diagnoses based on reviews of diagnosis-related groups. Patients with low-risk lesions on endoscopy, such as ulcers with a clean base, esophagitis, gastritis, duodenitis, or Mallory-Weiss tears, are felt to have less than a 5% chance of recurrent bleeding. In some instances, these patients can be treated successfully and discharged to home.1
Unfortunately, endoscopy is not always available—especially late at night and on weekends. It would be helpful to have a clinical prediction rule to identify patients at low risk for bleeding who could be safely discharged to get endoscopy within a few days.
In the study, 1,869 patients who had undergone upper endoscopy for upper gastrointestinal bleeding were entered into the Canadian national Registry for Upper GI Bleeding and Endoscopy (RUGBE). A modified Blatchford risk score (mBRS) was calculated to see whether it could predict the presence of high-risk stigmata of bleeding, rebleeding rates, and mortality.
This mBRS was also compared with another scoring system, the Rockall score. The mBRS uses clinical and laboratory data to risk-stratify nonvariceal bleeding; its variables include hemoglobin, systolic blood pressure, heart rate, melena, liver disease, and heart failure. High-risk endoscopic stigmata were defined as an adherent clot after irrigation; a bleeding, oozing, or spurting vessel; or a nonbleeding visible vessel. Rebleeding was defined as hematemesis, melena, or a bloody nasogastric aspirate in the presence of shock or a decrease in hemoglobin of 2 g/dL or more.
Patients with a modified Blatchford risk score of <1 had a lower likelihood of high-risk stigmata on endoscopy and a low risk of rebleeding (5%). Patients who had high-risk stigmata on endoscopy but an mBRS of <1 also had low rebleeding rates. The mBRS seemed to be a better predictor than the Rockall score for both high-risk stigmata and rebleeding.
Patients with nonvariceal upper gastrointestinal tract bleeding may be identified as low risk for re-bleeding if they are normotensive, not tachycardic, not anemic, and do not have active melena, liver disease, or heart failure. It is conceivable that if endoscopy were not available, these patients could be sent home on high-dose proton pump inhibitor and asked to return for outpatient upper endoscopy within a few days.
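The clinical criteria in the paragraph above amount to a simple screen. A hypothetical sketch follows; the function name and the numeric thresholds for "normotensive," "not tachycardic," and "not anemic" are illustrative assumptions, not the published mBRS weights or cut-offs:

```python
# Hypothetical low-risk screen based on the clinical criteria summarized above.
# Thresholds are illustrative assumptions, not the published mBRS cut-offs.
def low_risk_ugi_bleed(hemoglobin_g_dl, systolic_bp, heart_rate,
                       melena, liver_disease, heart_failure):
    """Return True only if the patient meets every low-risk criterion."""
    return (hemoglobin_g_dl >= 12    # not anemic (assumed threshold)
            and systolic_bp >= 100   # normotensive (assumed threshold)
            and heart_rate < 100     # not tachycardic (assumed threshold)
            and not melena
            and not liver_disease
            and not heart_failure)

# Example: a stable patient with none of the high-risk features
print(low_risk_ugi_bleed(13.5, 124, 82, False, False, False))  # True
```

Any single abnormal criterion fails the screen, mirroring the "all criteria normal" logic of the low-risk definition described above.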
The study certainly raises interesting questions. Whether it is acceptable practice to discharge a “low-risk” patient with an upper gastrointestinal hemorrhage on a high-dose proton pump inhibitor, with good social support and close outpatient follow-up but without diagnostic endoscopy, remains unclear.
The study is limited by the fact that it is a retrospective analysis; however, it does examine a large cohort of patients. The authors acknowledge this, and this work could lead to a prospective randomized trial that would help answer this question. In the meantime, the mBRS may be a helpful tool to help risk stratify patients admitted for nonvariceal upper gastrointestinal bleeding.
References
- Cipolletta L, Bianco M, Rotondano G, et al. Outpatient management for low-risk nonvariceal upper GI bleeding: a randomized controlled trial. Gastrointest Endosc. 2002;55(1):1-5.
Lumbar Puncture to Reduce Adverse Events
Straus SE, Thorpe KE, Holroyd-Leduc J. How do I perform a lumbar puncture and analyze the results to diagnose bacterial meningitis? JAMA. 2006 Oct 25;296(16):2012-2022.
Lumbar punctures (LPs) remain a common diagnostic test performed by physicians to rule out meningitis. This procedure may be associated with adverse events, with headache and backache the most commonly reported. This systematic review and meta-analysis sought to review the evidence regarding diagnostic lumbar puncture techniques that might reduce the risk of adverse events, and to examine the accuracy of cerebrospinal fluid (CSF) analysis in the diagnosis of bacterial meningitis.
Relevant studies were identified through searches of the Cochrane Library (www3.interscience.wiley.com/cgi-bin/mrwhome/106568753/AboutCochrane.html), MEDLINE from 1966 to January 2006, and EMBASE from 1980 to January 2006, without language restrictions. Bibliographies of retrieved articles were also used as data sources.
Randomized controlled trials of patients 18 or older undergoing lumbar puncture testing interventions to facilitate a successful diagnostic procedure or reduce adverse events were identified and selected. As a secondary outcome, trials that assessed the accuracy of CSF biochemical analysis for the diagnosis of bacterial meningitis were also identified and included. Trials that studied spinal anesthesia or myelography were excluded.
Study appraisals for quality (randomization, blinding, and outcome assessment) and data extraction were performed by two investigators independently. Fifteen randomized trials of interventions to reduce adverse events met criteria for inclusion, and four studies of the diagnostic test characteristics of CSF analysis met criteria and were included.
Meta-analysis with a random-effects model of five studies (587 patients total) comparing atraumatic needles with standard needles yielded a nonsignificant decrease in the risk of headache with an atraumatic needle (absolute risk reduction [ARR], 12.3%; 95% confidence interval [CI], –1.72% to 26.2%). A single study of reinsertion of the stylet before needle removal (600 patients) showed a decreased risk of headache (ARR, 11.3%; 95% CI, 6.50% to 16.2%). Meta-analysis of four studies (717 patients) revealed a nonsignificant decrease in headache in patients mobilized after LP (ARR, 2.9%; 95% CI, –3.4% to 9.3%).
Data from the diagnostic test studies yielded the following likelihood ratios for diagnosing bacterial meningitis: a CSF-blood glucose ratio of 0.4 or less, likelihood ratio of 18 (95% CI, 12-27); a CSF white blood cell count of 500/µL or higher, likelihood ratio of 15 (95% CI, 10-22); and a CSF lactate level of >31.53 mg/dL, likelihood ratio of 21 (95% CI, 14-32).
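Likelihood ratios of this magnitude shift probability substantially. As an illustration (the 10% pretest probability is an assumed value, not a figure from the study), applying the LR of 18 for a low CSF-blood glucose ratio via Bayes' rule on the odds scale:

```python
# Post-test probability from a pretest probability and a likelihood ratio,
# via Bayes' rule on the odds scale.
def post_test_probability(pretest_prob, likelihood_ratio):
    pretest_odds = pretest_prob / (1 - pretest_prob)
    post_odds = pretest_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Assumed 10% pretest probability; LR of 18 for CSF-blood glucose ratio <= 0.4
p = post_test_probability(0.10, 18)
print(f"{p:.0%}")  # 67%
```

Even a modest 10% pretest suspicion rises to roughly two-in-three odds of bacterial meningitis with this single finding, under the assumed inputs.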
These data support reinserting the stylet before needle removal to reduce the risk of headache after lumbar puncture and indicate that patients do not require bed rest after a diagnostic lumbar puncture. Biochemical analyses, including the CSF-blood glucose ratio, CSF leukocyte count, and lactate level, are useful in diagnosing bacterial meningitis.
This Rational Clinical Examination systematic review and meta-analysis provides a nice review of the available data on optimizing diagnostic lumbar puncture technique to reduce adverse events. It is somewhat remarkable that so little has changed in our knowledge of this long-standing diagnostic procedure. Post-lumbar puncture headaches remain a challenge that may affect patient satisfaction as well as the hospital (or observation unit) course, particularly for patients who do not have evidence of bacterial meningitis once the analysis is complete.
This review provides some useful answers for physicians performing lumbar puncture, who should consider selecting a small-gauge needle and reinserting the stylet before removal. Future studies of maneuvers to reduce post-procedure adverse events should address the question of atraumatic needles, which may be technically more difficult to use. The review also confirms and helps quantify the utility of CSF biochemical analysis in the diagnosis of bacterial meningitis.
Who’s Performing Procedures?
Wigton RS, Alguire P. The declining number and variety of procedures done by general internists: a resurvey of members of the American College of Physicians. Ann Intern Med. 2007 Mar 6;146(5):355-360. Comment in Ann Intern Med. 2007 Mar 6; 146(5):392-393.
Prior surveys documented that general internists performed a significant number and variety of procedures in their practice. Much has changed since those assessments, including physician training, practice settings, availability of subspecialists, and regulatory requirements, all of which have altered physicians' practice with regard to procedures. This study sought to reassess the volume and variety of procedures performed by general internists compared with a prior survey from 1986. The final sample included 990 completed surveys from general internists out of 1,389 returned questionnaires, for a successful completion rate of 39.6%.
The median number of different procedures performed in practice decreased from 16 in 1986 to seven in 2004. Internists who practiced in smaller hospitals or smaller towns reported performing almost twice as many procedures as physicians in the largest hospitals and cities. Hours spent in the care of hospitalized patients were also associated with an increased number of different procedures—in particular mechanical ventilation, central venous catheter placement, and thoracentesis. For all but one of the 34 procedures common to both surveys, fewer general internists performed them in 2004 compared with 1986. Remarkably, for 22 of the 34 procedures, a greater than 50% reduction in the proportion of respondents who performed the procedure was noted.
In the 1986 survey, the majority of internists performed all but one of the six procedures required by the American Board of Internal Medicine (ABIM) for certification (abdominal paracentesis, arterial puncture for blood gases, central venous catheter placement, joint aspiration, lumbar puncture, and thoracentesis). Except for joint aspiration, in 2004 these required procedures were performed by 25% or fewer of the respondents.
The 2004 survey demonstrated a striking reduction in the number of different procedures performed by general internists, and a decrease in the proportion of internists who do most procedures. These reductions may stem from a variety of changes in physician practices, including the emergence of hospitalists, availability of subspecialty physicians and proceduralists, and changes in technology and regulatory environments.
Regardless of the forces behind these changes, internal medicine residents’ training in procedures should be re-examined.
Many of those in academic hospital medicine have noted a decline in procedures performed by general internists at large academic centers. This study affirms this trend overall and in particular for physicians in large urban settings or in the largest hospitals. The emergence of hospital medicine may have played a role in reducing the procedures performed by primary care (outpatient) physicians who now spend less time caring for medically ill hospitalized patients.
Residency programs now must consider how to incorporate procedure skills and training to align with the needs of internists. The rising interest in careers in hospital medicine (as opposed to outpatient primary care) necessitates a new approach and individualized plans for gaining procedural skills to match career goals and practice settings. The new ABIM policy acknowledges this greater variability in the procedures performed by internists in practice, and takes steps to more closely align procedure requirements and core manual skills with physician practice.
These changes and new flexibility in requirements provide another opportunity for academic hospital medicine programs to provide leadership, and help shape the training of inpatient physicians. TH
Thrombocytopenia Reaction to Vancomycin
Von Drygalski A, Curtis BR, Bougie DW, et al. Vancomycin-induced immune thrombocytopenia. N Engl J Med. 2007 Mar 1;356(9):904-910
The use of vancomycin has grown exponentially in the past 20 years.1 Physicians have become increasingly aware of its major side effects, such as red man syndrome, hypersensitivity, neutropenia, and nephrotoxicity. But there have been only a few case reports of thrombocytopenia associated with this drug. This article looked at cases of thrombocytopenia in patients referred for clinical suspicion of vancomycin-induced thrombocytopenia.
From 2001-2005, serum samples were sent to the Platelet and Neutrophil Immunology Laboratory at the BloodCenter of Wisconsin in Milwaukee for testing for vancomycin-dependent antibodies from several sites. Clinical information regarding these patients was obtained from their referring physicians and one of the authors. Platelet reactive antibodies were detected by flow cytometry.
IgG and IgM vancomycin-dependent antibodies were detected in 34 patients. It was found that platelets dropped an average of 93% from pretreatment levels, and the average nadir occurred on day eight. The mean platelet count was 13,600. After vancomycin was discontinued, the platelet count returned to normal in all patients except for the three who died. The average time for resolution of thrombocytopenia was 7.5 days.
Unlike other drug-induced thrombocytopenia, these cases of thrombocytopenia associated with vancomycin appear to be more prone to significant hemorrhage. In this group 34% were found to have had severe hemorrhage defined in this study as florid petechial hemorrhages, ecchymoses, and oozing form the buccal mucosa. Three patients who had renal insufficiency were found to be profoundly thrombocytopenic for a longer duration, presumably due to delayed clearance of vancomycin in this setting.
Based on this study, it appears thrombocytopenia is a significant adverse reaction that can be attributed to vancomycin. Unlike other drug-induced thrombocytopenias, it appears to be associated with a higher likelihood of significant hemorrhage, as well.
Thrombocytopenia is a common occurrence in the acutely ill hospitalized patient and has been linked to increased hospital mortality and increased length of stay.2 Many drugs and diseases that hospitalists treat are associated with thrombocytopenia. The indications for vancomycin use continue to grow with the increasing number of patients with prosthetic devices and intravascular access and the increasing prevalence of MRSA. This study raises awareness of a significant side effect that can be associated with vancomycin.
References
- Ena J, Dick RW, Jones RN, et al. The epidemiology of intravenous vancomycin usage in a university hospital: a 10-year study. JAMA. 1993 Feb 3;269(5):598-602. Comment in JAMA. 1993 Sep 22-29;270(12):1426.
- Crowther MA, Cook DJ, Meade M, et al. Thrombocytopenia in medical-surgical critically ill patients: prevalence, incidence, and risk factors. J Crit Care. 2005 Dec;20(4):248-253.
Can the mBRS Stratify Pts Admitted for Nonvariceal Upper GI Bleeds?
Romagnuolo J, Barkun AN, Enns R, et al. Simple clinical predictors may obviate urgent endoscopy in selected patients with nonvariceal upper gastrointestinal tract bleeding. Arch Intern Med. 2007 Feb 12;167(3):265-270.
Nonvariceal upper gastrointestinal bleeding is one of the top 10 admission diagnoses based on reviews of diagnosis-related groups. Patients with low-risk lesions on endoscopy, such as ulcers with a clean base, esophagitis, gastritis, duodenitis, or Mallory-Weiss tears, are felt to have less than a 5% chance of recurrent bleeding. In some instances, these patients can be treated successfully and discharged to home.1
Unfortunately, endoscopy is not always available—especially late at night and on weekends. It would be helpful to have a clinical prediction rule to identify patients at low risk for bleeding who could be safely discharged to get endoscopy within a few days.
In the study, 1,869 patients who had undergone upper endoscopy for upper gastrointestinal bleeding were entered into the Canadian national Registry for Upper GI Bleeding and Endoscopy (RUGBE). A modified Blatchford risk score (mBRS) was calculated to see whether it could predict the presence of high-risk stigmata of bleeding, rebleeding rates, and mortality.
The mBRS was also compared with another scoring system, the Rockall score. The mBRS uses clinical and laboratory data to risk-stratify nonvariceal bleeding; its variables are hemoglobin, systolic blood pressure, heart rate, melena, liver disease, and heart failure. High-risk endoscopic stigmata were defined as an adherent clot after irrigation; a bleeding, oozing, or spurting vessel; or a nonbleeding visible vessel. Rebleeding was defined as hematemesis, melena, or a bloody nasogastric aspirate in the presence of shock, or a decrease in hemoglobin of 2 g/dL or more.
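The clinical variables above can be sketched as a simple additive score. The cutoffs and one-point weights in this sketch are illustrative assumptions only, not the published mBRS weights, which should be taken from the original article:

```python
def modified_blatchford_score(hemoglobin_g_dl, systolic_bp, heart_rate,
                              melena, liver_disease, heart_failure):
    """Illustrative sketch of an mBRS-style additive risk score.

    Thresholds and point values here are hypothetical placeholders;
    consult the published score for the actual weights.
    """
    score = 0
    if hemoglobin_g_dl < 12.0:   # anemia (assumed cutoff)
        score += 1
    if systolic_bp < 110:        # hypotension (assumed cutoff)
        score += 1
    if heart_rate >= 100:        # tachycardia (assumed cutoff)
        score += 1
    if melena:                   # active melena
        score += 1
    if liver_disease:            # known liver disease
        score += 1
    if heart_failure:            # known heart failure
        score += 1
    return score
```

Under this sketch, a patient who is normotensive, not tachycardic, not anemic, and free of the three clinical features scores 0, i.e., falls in the mBRS <1 "low-risk" group discussed below.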
Patients who had a modified Blatchford risk score of <1 were found to have a lower likelihood of high-risk stigmata on endoscopy and were at low risk for rebleeding (5%). Patients who had high-risk stigmata on endoscopy but an mBRS of <1 were also found to have low rebleeding rates. The mBRS seemed to be a better predictor than the Rockall score for both high-risk stigmata and rebleeding rates.
Patients with nonvariceal upper gastrointestinal tract bleeding may be identified as low risk for rebleeding if they are normotensive, not tachycardic, not anemic, and do not have active melena, liver disease, or heart failure. It is conceivable that if endoscopy were not available, these patients could be sent home on a high-dose proton pump inhibitor and asked to return for outpatient upper endoscopy within a few days.
The study certainly raises interesting questions. Whether it is acceptable practice to discharge a “low-risk” patient with an upper gastrointestinal hemorrhage on a high-dose proton pump inhibitor, with good social support and close outpatient follow-up but without diagnostic endoscopy, remains unclear.
The study is limited by the fact that it is a retrospective analysis; however, it does examine a large cohort of patients. The authors acknowledge this, and this work could lead to a prospective randomized trial that would help answer this question. In the meantime, the mBRS may be a helpful tool to help risk stratify patients admitted for nonvariceal upper gastrointestinal bleeding.
References
- Cipolletta L, Bianco M, Rotondano G, et al. Outpatient management for low-risk nonvariceal upper GI bleeding: a randomized controlled trial. Gastrointest Endosc. 2002;55(1):1-5.
Lumbar Puncture to Reduce Adverse Events
Straus SE, Thorpe KE, Holroyd-Leduc J. How do I perform a lumbar puncture and analyze the results to diagnose bacterial meningitis? JAMA. 2006 Oct 25;296(16):2012-2022.
Lumbar punctures (LPs) remain a common diagnostic test performed by physicians to rule out meningitis. This procedure may be associated with adverse events, with headache and backache the most commonly reported. This systematic review and meta-analysis sought to review the evidence regarding diagnostic lumbar puncture techniques that might reduce the risk of adverse events, and to examine the accuracy of cerebrospinal fluid (CSF) analysis in the diagnosis of bacterial meningitis.
Relevant studies were identified through searches of the Cochrane Library (www3.interscience.wiley.com/cgi-bin/mrwhome/106568753/AboutCochrane.html), MEDLINE from 1966 to January 2006, and EMBASE from 1980 to January 2006, without language restrictions. Bibliographies of retrieved articles were also used as data sources.
Randomized controlled trials of patients 18 or older undergoing lumbar puncture testing interventions to facilitate a successful diagnostic procedure or reduce adverse events were identified and selected. As a secondary outcome, trials that assessed the accuracy of CSF biochemical analysis for the diagnosis of bacterial meningitis were also identified and included. Trials that studied spinal anesthesia or myelography were excluded.
Study appraisals for quality (randomization, blinding, and outcome assessment) and data extraction were performed by two investigators independently. Fifteen randomized trials of interventions to reduce adverse events met criteria for inclusion, and four studies of the diagnostic test characteristics of CSF analysis met criteria and were included.
Meta-analysis with a random-effects model of five studies (587 patients total) comparing atraumatic needles with standard needles yielded a nonsignificant decrease in the risk of headache with an atraumatic needle (absolute risk reduction [ARR], 12.3%; 95% confidence interval [CI], –1.72% to 26.2%). A single study of reinsertion of the stylet before needle removal (600 patients) showed a decreased risk of headache (ARR, 11.3%; 95% CI, 6.50%-16.2%). Meta-analysis of four studies (717 patients) revealed a nonsignificant decrease in headache in patients mobilized after LP (ARR, 2.9%; 95% CI, –3.4% to 9.3%).
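For readers who want to reproduce this kind of summary statistic, an absolute risk reduction and a Wald-type confidence interval can be computed from two event proportions. The event counts in the example are hypothetical, not taken from the trials above:

```python
import math

def arr_with_ci(events_control, n_control, events_treated, n_treated, z=1.96):
    """Absolute risk reduction (control risk minus treated risk)
    with a Wald-type 95% confidence interval."""
    p_c = events_control / n_control
    p_t = events_treated / n_treated
    arr = p_c - p_t
    # Standard error of a difference between two independent proportions
    se = math.sqrt(p_c * (1 - p_c) / n_control + p_t * (1 - p_t) / n_treated)
    return arr, arr - z * se, arr + z * se

# Hypothetical counts: 30/100 headaches with standard care vs. 20/100
# with the intervention, giving an ARR of 10 percentage points.
arr, lower, upper = arr_with_ci(30, 100, 20, 100)
```

A lower confidence bound below zero, as with the atraumatic-needle and early-mobilization results above, marks the difference as nonsignificant.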
Data from the diagnostic test studies yielded the following likelihood ratios for accurately diagnosing bacterial meningitis: a CSF-blood glucose ratio of 0.4 or less, likelihood ratio of 18 (95% CI, 12-27); a CSF white blood cell count of 500/µL or higher, likelihood ratio of 15 (95% CI, 10-22); and a CSF lactate level of >31.53 mg/dL, likelihood ratio of 21 (95% CI, 14-32).
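A likelihood ratio is applied to a pretest probability through odds, the standard Bayesian conversion. The 10% pretest probability in the comment is an assumed figure for illustration, not a value from the review:

```python
def posttest_probability(pretest_prob, likelihood_ratio):
    """Convert a pretest probability to a posttest probability
    by multiplying the pretest odds by the likelihood ratio."""
    pretest_odds = pretest_prob / (1 - pretest_prob)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1 + posttest_odds)

# With an assumed 10% pretest probability of bacterial meningitis, a
# CSF-blood glucose ratio of 0.4 or less (LR 18) raises the probability
# to about 67%.
print(round(posttest_probability(0.10, 18), 2))  # prints 0.67
```

This is why likelihood ratios above 10 are considered strong evidence: each of the three findings above would move a modest clinical suspicion to a high probability of disease.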
These data support reinserting the stylet before needle removal to reduce the risk of headache after lumbar puncture and indicate that patients do not require bed rest after diagnostic lumbar puncture. Biochemical analyses, including the CSF-blood glucose ratio, CSF leukocyte count, and lactate level, are useful in diagnosing bacterial meningitis.
This Rational Clinical Examination systematic review and meta-analysis provides a nice review of the available data on optimizing diagnostic lumbar puncture technique to reduce adverse events. It is somewhat remarkable that so little has changed in our knowledge about this long-standing diagnostic procedure. Post-lumbar puncture headaches remain a challenge that may affect patient satisfaction as well as the hospital (or observation unit) course, particularly for patients who prove not to have bacterial meningitis once the analysis is complete.
This review provides some useful answers for physicians performing lumbar puncture, who should consider selecting a small-gauge needle and reinserting the stylet prior to removal. Future studies of other maneuvers to reduce post-procedure adverse events should be considered, particularly for atraumatic needles, which may be technically more difficult to use. The review confirms and helps quantify the utility of CSF biochemical analysis in the diagnosis of bacterial meningitis.
Who’s Performing Procedures?
Wigton RS, Alguire P. The declining number and variety of procedures done by general internists: a resurvey of members of the American College of Physicians. Ann Intern Med. 2007 Mar 6;146(5):355-360. Comment in Ann Intern Med. 2007 Mar 6; 146(5):392-393.
Prior surveys of physicians documented that general internists performed a significant number and variety of procedures in their practice. Much has changed since those prior assessments, including physician training, practice settings, availability of subspecialists, and regulatory requirements, all of which have altered physicians' practice with regard to procedures. This study sought to reassess the volume and variety of procedures performed by general internists compared with the prior survey of 1986. Of 1,389 returned questionnaires, the final sample included 990 completed surveys from general internists, for a successful completion rate of 39.6%.
The median number of different procedures performed in practice decreased from 16 in 1986 to seven in 2004. Internists who practiced in smaller hospitals or smaller towns reported performing almost twice as many procedures as physicians in the largest hospitals and cities. Hours spent in the care of hospitalized patients were also associated with an increased number of different procedures, in particular mechanical ventilation, central venous catheter placement, and thoracentesis. For all but one of the 34 procedures common to both surveys, fewer general internists performed them in 2004 than in 1986. Remarkably, for 22 of the 34 procedures, the proportion of respondents who performed the procedure fell by more than 50%.
In the 1986 survey, the majority of internists performed all but one of the six procedures required by the American Board of Internal Medicine (ABIM) for certification (abdominal paracentesis, arterial puncture for blood gases, central venous catheter placement, joint aspiration, lumbar puncture, and thoracentesis). Except for joint aspiration, in 2004 these required procedures were performed by 25% or fewer of the respondents.
The 2004 survey demonstrated a striking reduction in the number of different procedures performed by general internists, and a decrease in the proportion of internists who do most procedures. These reductions may stem from a variety of changes in physician practices, including the emergence of hospitalists, availability of subspecialty physicians and proceduralists, and changes in technology and regulatory environments.
Regardless of the forces behind these changes, internal medicine residents’ training in procedures should be re-examined.
Many of those in academic hospital medicine have noted a decline in procedures performed by general internists at large academic centers. This study affirms this trend overall and in particular for physicians in large urban settings or in the largest hospitals. The emergence of hospital medicine may have played a role in reducing the procedures performed by primary care (outpatient) physicians who now spend less time caring for medically ill hospitalized patients.
Residency programs now must consider how to incorporate procedure skills and training to align with the needs of internists. The rising interest in careers in hospital medicine (as opposed to outpatient primary care) necessitates a new approach and individualized plans for gaining procedural skills to match career goals and practice settings. The new ABIM policy acknowledges this greater variability in the procedures performed by internists in practice, and takes steps to more closely align procedure requirements and core manual skills with physician practice.
These changes and new flexibility in requirements provide another opportunity for academic hospital medicine programs to provide leadership, and help shape the training of inpatient physicians. TH