In the Literature: Research You Need to Know
Clinical question: Which clinical decision rule—Wells rule, simplified Wells rule, revised Geneva score, or simplified revised Geneva score—is the best for evaluating a patient with a possible acute pulmonary embolism?
Background: The use of standardized clinical decision rules to determine the probability of an acute pulmonary embolism (PE) has significantly improved the diagnostic evaluation of patients with suspected PE. Several clinical decision rules are available and widely used, but they have not been previously directly compared.
Study design: Prospective cohort.
Setting: Seven hospitals in the Netherlands.
Synopsis: A total of 807 patients with suspected first episode of acute PE had a sequential workup with clinical probability assessment and D-dimer testing. When PE was considered unlikely according to all four clinical decision rules and a normal D-dimer result, PE was excluded. In the remaining patients, a CT scan was used to confirm or exclude the diagnosis.
The prevalence of PE was 23%. Combined with a normal D-dimer, the decision rules excluded PE in 22% to 24% of patients. Thirty percent of patients had discordant decision rule outcomes, but among those with a normal D-dimer, PE was not detected by CT in any of these patients.
This study has practical limitations because management was based on a combination of four decision rules and D-dimer testing rather than only one rule and D-dimer testing, which is the more realistic clinical approach.
Bottom line: When used correctly and in conjunction with a D-dimer result, the Wells rule, simplified Wells rule, revised Geneva score, and simplified revised Geneva score all perform similarly in the exclusion of acute PE.
Citation: Douma RA, Mos IC, Erkens PM, et al. Performance of 4 clinical decision rules in the diagnostic management of acute pulmonary embolism: a prospective cohort study. Ann Intern Med. 2011;154:709-718.
For more physician reviews of HM-related literature, check out this month's "In the Literature."
When Should a Patient with Ascites Receive Spontaneous Bacterial Peritonitis (SBP) Prophylaxis?
Case
A 54-year-old man with end-stage liver disease (ESLD) and no prior history of spontaneous bacterial peritonitis (SBP) presents with increasing shortness of breath and abdominal distention. He is admitted for worsening volume overload. The patient reveals that he has not been compliant with his diuretics. On the day of admission, a large-volume paracentesis is performed. Results are significant for a white blood cell count of 150 cells/mm3 and a total protein of 0.9 g/dL. The patient is started on furosemide and spironolactone, and his symptoms significantly improve throughout his hospitalization. His medications are reconciled on the day of discharge. He is not on any antibiotics for SBP prophylaxis; should he be? In general, which patients with ascites should receive SBP prophylaxis?
Overview
Spontaneous bacterial peritonitis is an infection of ascitic fluid that occurs in the absence of an identified intra-abdominal source of infection or inflammation, e.g., perforation or abscess.1 It is diagnosed when the polymorphonuclear cell (PMN) count in the ascitic fluid is equal to or greater than 250 cells/mm3, with or without positive cultures.
SBP is a significant cause of morbidity and mortality in patients with cirrhosis, with the mortality rate approaching 20% to 40%.2 Of the 32% to 34% of cirrhotic patients who present with, or develop, a bacterial infection during their hospitalization, 25% are due to SBP.1 Changes in gut motility, mucosal defense, and microflora allow for translocation of bacteria into enteric lymph nodes and the bloodstream, resulting in seeding of the peritoneal fluid and SBP.1 Alterations in both systemic and localized immune defenses, both of which are reduced in patients with liver disease, also play a role in SBP pathogenesis (see Table 1, p. 41).
Current evidence supports the use of a third-generation cephalosporin or amoxicillin/clavulanate for initial treatment of SBP, as most infections are caused by gram-negative bacilli, in particular E. coli (see Table 2 on p. 41 and Table 3 on p. 42).1 Alternatively, an oral or intravenous fluoroquinolone could be used if the prevalence of fluoroquinolone-resistant organisms is low.1
Due to the frequency and morbidity associated with SBP, there is great interest in preventing it. However, the use of prophylactic antibiotics needs to be restricted to patients who are at highest risk of developing SBP. According to numerous studies, patients at high risk for SBP include:
- Patients with a prior SBP history;
- Patients admitted with a gastrointestinal bleed; and
- Patients with low total protein content in their ascitic fluid (defined as <1.5 g/dL).1
SBP History
Spontaneous bacterial peritonitis portends poor outcomes. The one-year mortality rate after an episode of SBP is 30% to 50%.1 Furthermore, patients who have recovered from a previous episode of SBP have a 70% chance of developing another episode within that year.1,2 In one study, norfloxacin was shown to decrease the one-year risk of SBP to 20% from 68% in patients with a history of SBP.3 Additionally, the likelihood of developing SBP from gram-negative bacilli was reduced to 3% from 60%. In order to be efficacious, norfloxacin must be given daily. When fluoroquinolones are prescribed less than once daily, there is a higher rate of fluoroquinolone-resistant organisms in the stool.1
Though once-daily dosing of norfloxacin is recommended to decrease the promotion of resistant organisms in prophylaxis against SBP, ciprofloxacin once weekly is acceptable. In a group of patients with low ascitic protein content, with or without a history of SBP, weekly ciprofloxacin has been shown to decrease SBP incidence to 4% from 22% at six months.4 In regard to length of treatment, recommendations are to continue prophylactic antibiotics until resolution of ascites, the patient receives a transplant, or the patient passes away.1
Saab et al studied the impact of oral antibiotic prophylaxis in patients with advanced liver disease on morbidity and mortality.5 The authors examined prospective, randomized, controlled trials that compared high-risk cirrhotic patients receiving oral antibiotic prophylaxis for SBP with groups receiving placebo or no intervention. Eight studies totaling 647 patients were included in the analysis.
The overall mortality rate for patients treated with SBP prophylaxis was 16%, compared with 25% for the control group. Groups treated with prophylactic antibiotics also had a lower incidence of all infections (6.2% vs. 22.2% in the control groups). Additionally, a survival benefit was seen at three months in the group that received prophylactic antibiotics.
The absolute risk reduction with prophylactic antibiotics for primary prevention of SBP was 8% with a number needed to treat of 13. The incidence of gastrointestinal (GI) bleeding, renal failure, and hepatic failure did not significantly differ between treatment and control groups. Thus, survival benefit is thought to be related to the reduced incidence of infections in the group receiving prophylactic antibiotics.5
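As a quick check of the arithmetic, the number needed to treat reported above follows directly from the absolute risk reduction:

$$
\text{NNT} = \frac{1}{\text{ARR}} = \frac{1}{0.08} = 12.5 \;\approx\; 13
$$

That is, roughly 13 high-risk patients would need to receive prophylactic antibiotics to prevent one episode of SBP (the NNT is conventionally rounded up to the next whole patient).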
History of GI Bleeding
The incidence of developing SBP in cirrhotics with an active GI bleed is anywhere from 20% to 45%.1,2 For those with ascites of any etiology and a GI bleed, the incidence can be as high as 60%.5 In general, bacterial infections are frequently diagnosed in patients with cirrhosis and GI bleeding, and have been documented in 22% of these patients within the first 48 hours after admission. According to several studies, that percentage can reach as high as 35% to 66% within seven to 14 days of admission.6 A seven-day course of antibiotics, or antibiotics until discharge, is generally acceptable for SBP prophylaxis in the setting of ascites and GI bleeding (see Table 2, right).1
Bernard et al performed a meta-analysis of five trials to assess the efficacy of antibiotic prophylaxis in the prevention of infections and effect on survival in patients with cirrhosis and GI bleeding. Out of 534 patients, 264 were treated with antibiotics between four and 10 days, and 270 did not receive any antibiotics.
The endpoints of the study were infection, bacteremia and/or SBP; incidence of SBP; and death. Antibiotic prophylaxis not only increased the mean survival rate by 9.1%, but also increased the mean percentage of patients free of infection (32% improvement); bacteremia and/or SBP (19% improvement); and SBP (7% improvement).7
Low Ascitic Fluid Protein
Of the three major risk factors for SBP, ascitic fluid protein content is the most debated. Guarner et al studied the risk of first community-acquired SBP in cirrhotics with low ascitic fluid protein.2 Patients were seen immediately after discharge from the hospital and at two- to three-month intervals. Of the 109 hospitalized patients, 23 (21%) developed SBP, nine of which developed SBP during their hospitalization. The one-year cumulative probability of SBP in these patients with low ascitic fluid protein levels was 35%.
During this study, the authors also looked at 20 different patient variables on admission and found that two parameters—high bilirubin (>3.2 mg/dL) and low platelet count (<98,000 cells/µL)—were associated with an increased risk of SBP. This is consistent with studies showing that patients with higher Model for End-Stage Liver Disease (MELD) or Child-Pugh scores, indicating more severe liver disease, are at increased risk for SBP. This likely is the reason SBP prophylaxis is recommended by the American Association for the Study of Liver Diseases for patients with an elevated bilirubin and higher Child-Pugh scores (see Table 2, p. 41).
Runyon et al showed that 15% of patients with low ascitic fluid protein developed SBP during their hospitalization, as compared with 2% of patients with ascitic fluid protein levels greater than 1 g/dL.8 A randomized, non-placebo-controlled trial by Navasa et al evaluating 70 cirrhotic patients with low ascitic protein levels showed a lower probability of developing SBP in the group placed on SBP prophylaxis with norfloxacin (5% vs. 31%).9 Six-month mortality was also lower (19% vs. 36%).
In contrast to the previous studies, Grothe et al found that the presence of SBP was not related to ascitic protein content.10 Given these conflicting studies, controversy remains over whether patients with low ascitic protein should receive long-term prophylactic antibiotics.
Antibiotic Drawbacks
The consensus in the literature is that patients with ascites who are admitted with a GI bleed, or those with a history of SBP, should be placed on SBP prophylaxis. However, patients placed on long-term antibiotics are at risk for developing bacterial resistance. Bacterial resistance in cultures taken from cirrhotic patients with SBP has increased over the last decade, particularly in gram-negative bacteria.5 Patients who receive antibiotics in the pre-transplant setting also are at risk for post-transplant fungal infections.
Additionally, the antibiotic of choice for SBP prophylaxis is typically a fluoroquinolone, which can be expensive. However, numerous studies have shown that the cost of initiating prophylactic therapy for SBP in patients with a prior episode of SBP can be cheaper than treating SBP after diagnosis.2
Back to the Case
Our patient’s paracentesis was negative for SBP. Additionally, he does not have a history of SBP, nor does he have an active GI bleed. His only possible indication for SBP prophylaxis is low ascitic protein concentration. His electrolytes were all within normal limits. Additionally, total bilirubin was only slightly elevated at 2.3 mg/dL.
Based on the American Association for the Study of Liver Diseases guidelines, the patient was not started on SBP prophylaxis. Additionally, given his history of medication noncompliance, there is concern that he might not take the antibiotics as prescribed, thus leading to the development of bacterial resistance and more serious infections in the future.
Bottom Line
Patients with ascites and a prior episode of SBP, and those admitted to the hospital for GI bleeding, should be placed on SBP prophylaxis. SBP prophylaxis for low protein ascitic fluid remains controversial but is recommended by the American Association for the Study of Liver Diseases. TH
Dr. del Pino Jones is a hospitalist at the University of Colorado Denver.
References
- Ghassemi S, Garcia-Tsao G. Prevention and treatment of infections in patients with cirrhosis. Best Pract Res Clin Gastroenterol. 2007;21(1):77-93.
- Guarner C, Solà R, Soriano G, et al. Risk of a first community-acquired spontaneous bacterial peritonitis in cirrhotics with low ascitic fluid protein levels. Gastroenterology. 1999;117(2):414-419.
- Ginés P, Rimola A, Planas R, et al. Norfloxacin prevents spontaneous bacterial peritonitis recurrence in cirrhosis: results of a double-blind, placebo-controlled trial. Hepatology. 1990;12(4 Pt 1):716-724.
- Rolachon A, Cordier L, Bacq Y, et al. Ciprofloxacin and long-term prevention of spontaneous bacterial peritonitis: results of a prospective controlled trial. Hepatology. 1995;22(4 Pt 1):1171-1174.
- Saab S, Hernandez J, Chi AC, Tong MJ. Oral antibiotic prophylaxis reduces spontaneous bacterial peritonitis occurrence and improves short-term survival in cirrhosis: a meta-analysis. Am J Gastroenterol. 2009;104(4):993-1001.
- Deschênes M, Villeneuve J. Risk factors for the development of bacterial infections in hospitalized patients with cirrhosis. Am J Gastroenterol. 1999;94(8):2193-2197.
- Bernard B, Grangé J, Khac EN, Amiot X, Opolon P, Poynard T. Antibiotic prophylaxis for the prevention of bacterial infections in cirrhotic patients with gastrointestinal bleeding: a meta-analysis. Hepatology. 1999;29(6):1655-1661.
- Runyon B. Low-protein-concentration ascitic fluid is predisposed to spontaneous bacterial peritonitis. Gastroenterology. 1986;91(6):1343-1346.
- Navasa M, Fernandez J, Montoliu S, et al. Randomized, double-blind, placebo-controlled trial evaluating norfloxacin in the primary prophylaxis of spontaneous bacterial peritonitis in cirrhotics with renal impairment, hyponatremia or severe liver failure. J Hepatol. 2006;44(Suppl 2):S51.
- Grothe W, Lotterer E, Fleig W. Factors predictive for spontaneous bacterial peritonitis (SBP) under routine inpatient conditions in patients with cirrhosis: a prospective multicenter trial. J Hepatol. 1990;34(4):547.
Case
A 54-year-old man with end-stage liver disease (ESLD) and no prior history of spontaneous bacterial peritonitis (SBP) presents with increasing shortness of breath and abdominal distention. He is admitted for worsening volume overload. The patient reveals that he has not been compliant with his diuretics. On the day of admission, a large-volume paracentesis is performed. Results are significant for a white blood cell count of 150 cells/mm3 and a total protein of 0.9 g/ul. The patient is started on furosemide and spironolactone, and his symptoms significantly improve throughout his hospitalization. His medications are reconciled on the day of discharge. He is not on any antibiotics for SBP prophylaxis; should he be? In general, which patients with ascites should receive SBP prophylaxis?
Overview
Spontaneous bacterial peritonitis is an infection of ascitic fluid that occurs in the absence of an indentified intra-abdominal source of infection or inflammation, i.e., perforation or abscess.1 It is diagnosed when the polymorphonuclear cell (PMN) count in the ascitic fluid is equal to or greater than 250 cells/mm3, with or without positive cultures.
SBP is a significant cause of morbidity and mortality in patients with cirrhosis, with the mortality rate approaching 20% to 40%.2 Of the 32% to 34% of cirrhotic patients who present with, or develop, a bacterial infection during their hospitalization, 25% are due to SBP.1 Changes in gut motility, mucosal defense, and microflora allow for translocation of bacteria into enteric lymph nodes and the bloodstream, resulting in seeding of the peritoneal fluid and SBP.1 Alterations in both systemic and localized immune defenses, both of which are reduced in patients with liver disease, also play a role in SBP pathogenesis (see Table 1, p. 41).
Current evidence supports the use of a third-generation cephalosporin or amoxicillin/clavulanate for initial treatment of SBP, as most infections are caused by gram-negative bacilli, in particular E. coli (see Table 2 on p. 41 and Table 3 on p. 42).1 Alternatively, an oral or intravenous fluoroquinolone could be used if the prevalence of fluoroquinolone-resistant organisms is low.1
Due to the frequency and morbidity associated with SBP, there is great interest in preventing it. However, the use of prophylactic antibiotics needs to be restricted to patients who are at highest risk of developing SBP. According to numerous studies, patients at high risk for SBP include:
- Patients with a prior SBP history;
- Patients admitted with a gastrointestinal bleed; and
- Patients with low total protein content in their ascitic fluid (defined as <1.5 g/ul).1
SBP History
Spontaneous bacterial peritonitis portends bad outcomes. The one-year mortality rate after an episode of SBP is 30% to 50%.1 Furthermore, patients who have recovered from a previous episode of SBP have a 70% chance of developing another episode of SBP in that year.1,2 In one study, norfloxacin was shown to decrease the one-year risk of SBP to 20% from 68% in patients with a history of SBP.3 Additionally, the likelihood of developing SBP from gram-negative bacilli was reduced to 3% from 60%. In order to be efficacious, norfloxacin must be given daily. When fluoroquinolones are prescribed less than once daily, there is a higher rate of fluoroquinolone resistant organisms in the stool.1
Though once-daily dosing of norfloxacin is recommended to decrease the promotion of resistant organisms in prophylaxis against SBP, ciprofloxacin once weekly is acceptable. In a group of patients with low ascitic protein content, with or without a history of SBP, weekly ciprofloxacin has been shown to decrease SBP incidence to 4% from 22% at six months.4 In regard to length of treatment, recommendations are to continue prophylactic antibiotics until resolution of ascites, the patient receives a transplant, or the patient passes away.1
Saab et al studied the impact of oral antibiotic prophylaxis in patients with advanced liver disease on morbidity and mortality.5 The authors examined prospective, randomized, controlled trials that compared high-risk cirrhotic patients receiving oral antibiotic prophylaxis for SBP with groups receiving placebo or no intervention. Eight studies totaling 647 patients were included in the analysis.
The overall mortality rate for patients treated with SBP prophylaxis was 16%, compared with 25% for the control group. Groups treated with prophylactic antibiotics also had a lower incidence of all infections (6.2% vs. 22.2% in the control groups). Additionally, a survival benefit was seen at three months in the group that received prophylactic antibiotics.
The absolute risk reduction with prophylactic antibiotics for primary prevention of SBP was 8% with a number needed to treat of 13. The incidence of gastrointestinal (GI) bleeding, renal failure, and hepatic failure did not significantly differ between treatment and control groups. Thus, survival benefit is thought to be related to the reduced incidence of infections in the group receiving prophylactic antibiotics.5
History of GI Bleeding
The incidence of developing SBP in cirrhotics with an active GI bleed is anywhere from 20% to 45%.1,2 For those with ascites of any etiology and a GI bleed, the incidence can be as high as 60%.5 In general, bacterial infections are frequently diagnosed in patients with cirrhosis and GI bleeding, and have been documented in 22% of these patients within the first 48 hours after admission. According to several studies, that percentage can reach as high as 35% to 66% within seven to 14 days of admission.6 A seven-day course of antibiotics, or antibiotics until discharge, is generally acceptable for SBP prophylaxis in the setting of ascites and GI bleeding (see Table 2, right).1
Bernard et al performed a meta-analysis of five trials to assess the efficacy of antibiotic prophylaxis in the prevention of infections and effect on survival in patients with cirrhosis and GI bleeding. Out of 534 patients, 264 were treated with antibiotics between four and 10 days, and 270 did not receive any antibiotics.
The endpoints of the study were infection, bacteremia and/or SBP; incidence of SBP; and death. Antibiotic prophylaxis not only increased the mean survival rate by 9.1%, but also increased the mean percentage of patients free of infection (32% improvement); bacteremia and/or SBP (19% improvement); and SBP (7% improvement).7
Low Ascitic Fluid Protein
Of the three major risk factors for SBP, ascitic fluid protein content is the most debated. Guarner et al studied the risk of first community-acquired SBP in cirrhotics with low ascitic fluid protein.2 Patients were seen immediately after discharge from the hospital and at two- to three-month intervals. Of the 109 hospitalized patients, 23 (21%) developed SBP, nine of which developed SBP during their hospitalization. The one-year cumulative probability of SBP in these patients with low ascitic fluid protein levels was 35%.
During this study, the authors also looked at 20 different patient variables on admission and found that two parameters—high bilirubin (>3.2mg/dL) and low platelet count (<98,000 cells/ul)—were associated with an increased risk of SBP. This is consistent with studies showing that patients with higher Model for End-Stage Liver Disease (MELD) or Child-Pugh scores, indicating more severe liver disease, are at increased risk for SBP. This likely is the reason SBP prophylaxis is recommended for patients with an elevated bilirubin, and higher Child-Pugh scores, by the American Association for the Study of Liver Disease (see Table 2, p. 41).
Runyon et al showed that 15% of patients with low ascitic fluid protein developed SBP during their hospitalization, as compared with 2% of patients with ascitic fluid levels greater than 1 g/dl.8 A randomized, non-placebo-controlled trial by Navasa et al evaluating 70 cirrhotic patients with low ascitic ascitic protein levels showed a lower probability of developing SBP in the group placed on SBP prophylaxis with norfloxacin (5% vs. 31%).9 Six-month mortality rate was also lower (19% vs. 36%).
In contrast to the previous studies, Grothe et al found that the presence of SBP was not related to ascitic protein content.10 Given conflicting studies, controversy still remains on whether patients with low ascitic protein should receive long-term prophylactic antibiotics.
Antibiotic Drawbacks
The consensus in the literature is that patients with ascites who are admitted with a GI bleed, or those with a history of SBP, should be placed on SBP prophylaxis. However, patients placed on long-term antibiotics are at risk for developing bacterial resistance. Bacterial resistance in cultures taken from cirrhotic patients with SBP has increased over the last decade, particularly in gram-negative bacteria.5 Patients who receive antibiotics in the pre-transplant setting also are at risk for post-transplant fungal infections.
Additionally, the antibiotic of choice for SBP prophylaxis is typically a fluoroquinolone, which can be expensive. However, numerous studies have shown that the cost of initiating prophylactic therapy for SBP in patients with a prior episode of SBP can be cheaper than treating SBP after diagnosis.2
Back to the Case
Our patient’s paracentesis was negative for SBP. Additionally, he does not have a history of SBP, nor does he have an active GI bleed. His only possible indication for SBP prophylaxis is low ascitic protein concentration. His electrolytes were all within normal limits. Additionally, total bilirubin was only slightly elevated at 2.3 mg/dL.
Based on the American Association for the Study of Liver Diseases guidelines, the patient was not started on SBP prophylaxis. Additionally, given his history of medication noncompliance, there is concern that he might not take the antibiotics as prescribed, thus leading to the development of bacterial resistance and more serious infections in the future.
Bottom Line
Patients with ascites and a prior episode of SBP, and those admitted to the hospital for GI bleeding, should be placed on SBP prophylaxis. SBP prophylaxis for low protein ascitic fluid remains controversial but is recommended by the American Association for the Study of Liver Diseases. TH
Dr. del Pino Jones is a hospitalist at the University of Colorado Denver.
References
- Ghassemi S, Garcia-Tsao G. Prevention and treatment of infections in patients with cirrhosis. Best Pract Res Clin Gastroenterol. 2007;21(1):77-93.
- Guarner C, Solà R, Soriono G, et al. Risk of a first community-acquired spontaneous bacterial peritonitis in cirrhotics with low ascitic fluid protein levels. Gastroenterology. 1999;117(2):414-419.
- Ginés P, Rimola A, Planas R, et al. Norfloxacin prevents spontaneous bacterial peritonitis recurrence in cirrhosis: results of a double-blind, placebo-controlled trial. Hepatology. 1990;12(4 Pt 1):716-724.
- Rolachon A, Cordier L, Bacq Y, et al. Ciprofloxacin and long-term prevention of spontaneous bacterial peritonitis: results of a prospective controlled trial. Hepatology. 1995;22(4 Pt 1):1171-1174.
- Saab S, Hernandez J, Chi AC, Tong MJ. Oral antibiotic prophylaxis reduces spontaneous bacterial peritonitis occurrence and improves short-term survival in cirrhosis: a meta-analysis. Am J Gastroenterol. 2009;104(4):993-1001.
- Deschênes M, Villeneuve J. Risk factors for the development of bacterial infections in hospitalized patients with cirrhosis. Am J Gastroenterol. 1999;94(8):2193-2197.
- Bernard B, Grangé J, Khac EN, Amiot X, Opolon P, Poynard T. Antibiotic prophylaxis for the prevention of bacterial infections in cirrhotic patients with gastrointestinal bleeding: a meta-analysis. Hepatology. 1999;29(6):1655-1661.
- Runyon B. Low-protein-concentration ascitic fluid is predisposed to spontaneous bacterial peritonitis. Gastroenterology. 1986;91(6):1343-1346.
- Navasa M, Fernandez J, Montoliu S, et al. Randomized, double-blind, placebo-controlled trial evaluating norfloxacin in the primary prophylaxis of spontaneous bacterial peritonitis in cirrhotics with renal impairment, hyponatremia or severe liver failure. J Hepatol. 2006;44(Supp2):S51.
- Grothe W, Lottere E, Fleig W. Factors predictive for spontaneous bacterial peritonitis (SBP) under routine inpatient conditions in patients with cirrhosis: a prospective multicenter trial. J Hepatol. 1990;34(4):547.
Case
A 54-year-old man with end-stage liver disease (ESLD) and no prior history of spontaneous bacterial peritonitis (SBP) presents with increasing shortness of breath and abdominal distention. He is admitted for worsening volume overload. The patient reveals that he has not been compliant with his diuretics. On the day of admission, a large-volume paracentesis is performed. Results are significant for a white blood cell count of 150 cells/mm3 and a total protein of 0.9 g/ul. The patient is started on furosemide and spironolactone, and his symptoms significantly improve throughout his hospitalization. His medications are reconciled on the day of discharge. He is not on any antibiotics for SBP prophylaxis; should he be? In general, which patients with ascites should receive SBP prophylaxis?
Overview
Spontaneous bacterial peritonitis is an infection of ascitic fluid that occurs in the absence of an indentified intra-abdominal source of infection or inflammation, i.e., perforation or abscess.1 It is diagnosed when the polymorphonuclear cell (PMN) count in the ascitic fluid is equal to or greater than 250 cells/mm3, with or without positive cultures.
SBP is a significant cause of morbidity and mortality in patients with cirrhosis, with the mortality rate approaching 20% to 40%.2 Of the 32% to 34% of cirrhotic patients who present with, or develop, a bacterial infection during their hospitalization, 25% are due to SBP.1 Changes in gut motility, mucosal defense, and microflora allow for translocation of bacteria into enteric lymph nodes and the bloodstream, resulting in seeding of the peritoneal fluid and SBP.1 Alterations in both systemic and localized immune defenses, both of which are reduced in patients with liver disease, also play a role in SBP pathogenesis (see Table 1, p. 41).
Current evidence supports the use of a third-generation cephalosporin or amoxicillin/clavulanate for initial treatment of SBP, as most infections are caused by gram-negative bacilli, in particular E. coli (see Table 2 on p. 41 and Table 3 on p. 42).1 Alternatively, an oral or intravenous fluoroquinolone could be used if the prevalence of fluoroquinolone-resistant organisms is low.1
Due to the frequency and morbidity associated with SBP, there is great interest in preventing it. However, the use of prophylactic antibiotics needs to be restricted to patients who are at highest risk of developing SBP. According to numerous studies, patients at high risk for SBP include:
- Patients with a prior SBP history;
- Patients admitted with a gastrointestinal bleed; and
- Patients with low total protein content in their ascitic fluid (defined as <1.5 g/dL).1
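The three high-risk groups above can be restated as a simple screening sketch. This is illustrative only: the function and parameter names are ours, and the low-protein criterion remains debated, as discussed later in this article.

```python
def sbp_prophylaxis_indicated(prior_sbp, gi_bleed, ascitic_protein_g_dl):
    """Flag the three high-risk groups listed above.

    prior_sbp and gi_bleed are booleans; ascitic protein is in g/dL.
    Illustrative only: the low-protein criterion (<1.5 g/dL) is debated.
    """
    return prior_sbp or gi_bleed or ascitic_protein_g_dl < 1.5

# A patient with a prior episode of SBP is high risk regardless of protein.
print(sbp_prophylaxis_indicated(True, False, 2.0))  # -> True
```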
SBP History
Spontaneous bacterial peritonitis portends poor outcomes. The one-year mortality rate after an episode of SBP is 30% to 50%.1 Furthermore, patients who have recovered from a previous episode of SBP have a 70% chance of developing another episode within the following year.1,2 In one study, norfloxacin was shown to decrease the one-year risk of SBP to 20% from 68% in patients with a history of SBP.3 Additionally, the likelihood of developing SBP from gram-negative bacilli was reduced to 3% from 60%. To be efficacious, norfloxacin must be given daily. When fluoroquinolones are prescribed less than once daily, there is a higher rate of fluoroquinolone-resistant organisms in the stool.1
Though once-daily dosing of norfloxacin is recommended to decrease the promotion of resistant organisms in prophylaxis against SBP, ciprofloxacin once weekly is acceptable. In a group of patients with low ascitic protein content, with or without a history of SBP, weekly ciprofloxacin has been shown to decrease SBP incidence to 4% from 22% at six months.4 In regard to length of treatment, recommendations are to continue prophylactic antibiotics until ascites resolves, the patient undergoes liver transplantation, or the patient dies.1
Saab et al studied the impact of oral antibiotic prophylaxis in patients with advanced liver disease on morbidity and mortality.5 The authors examined prospective, randomized, controlled trials that compared high-risk cirrhotic patients receiving oral antibiotic prophylaxis for SBP with groups receiving placebo or no intervention. Eight studies totaling 647 patients were included in the analysis.
The overall mortality rate for patients treated with SBP prophylaxis was 16%, compared with 25% for the control group. Groups treated with prophylactic antibiotics also had a lower incidence of all infections (6.2% vs. 22.2% in the control groups). Additionally, a survival benefit was seen at three months in the group that received prophylactic antibiotics.
The absolute risk reduction with prophylactic antibiotics for primary prevention of SBP was 8% with a number needed to treat of 13. The incidence of gastrointestinal (GI) bleeding, renal failure, and hepatic failure did not significantly differ between treatment and control groups. Thus, survival benefit is thought to be related to the reduced incidence of infections in the group receiving prophylactic antibiotics.5
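The NNT quoted above follows directly from the absolute risk reduction; as a quick arithmetic check (the helper name is ours):

```python
import math

def number_needed_to_treat(absolute_risk_reduction):
    """NNT is the reciprocal of the absolute risk reduction, rounded up."""
    return math.ceil(1 / absolute_risk_reduction)

# 8% absolute risk reduction for primary prevention of SBP, as quoted above.
print(number_needed_to_treat(0.08))  # -> 13
```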
History of GI Bleeding
The incidence of SBP in cirrhotics with an active GI bleed ranges from 20% to 45%.1,2 For those with ascites of any etiology and a GI bleed, the incidence can be as high as 60%.5 In general, bacterial infections are frequently diagnosed in patients with cirrhosis and GI bleeding, and have been documented in 22% of these patients within the first 48 hours after admission. According to several studies, that percentage can reach as high as 35% to 66% within seven to 14 days of admission.6 A seven-day course of antibiotics, or antibiotics until discharge, is generally acceptable for SBP prophylaxis in the setting of ascites and GI bleeding (see Table 2, right).1
Bernard et al performed a meta-analysis of five trials to assess the efficacy of antibiotic prophylaxis in the prevention of infections and effect on survival in patients with cirrhosis and GI bleeding. Out of 534 patients, 264 were treated with antibiotics between four and 10 days, and 270 did not receive any antibiotics.
The endpoints of the study were infection, bacteremia and/or SBP; incidence of SBP; and death. Antibiotic prophylaxis not only increased the mean survival rate by 9.1%, but also increased the mean percentage of patients free of infection (32% improvement); bacteremia and/or SBP (19% improvement); and SBP (7% improvement).7
Low Ascitic Fluid Protein
Of the three major risk factors for SBP, ascitic fluid protein content is the most debated. Guarner et al studied the risk of first community-acquired SBP in cirrhotics with low ascitic fluid protein.2 Patients were seen immediately after discharge from the hospital and at two- to three-month intervals. Of the 109 hospitalized patients, 23 (21%) developed SBP, nine of whom developed it during their hospitalization. The one-year cumulative probability of SBP in these patients with low ascitic fluid protein levels was 35%.
During this study, the authors also looked at 20 different patient variables on admission and found that two parameters—high bilirubin (>3.2 mg/dL) and low platelet count (<98,000 cells/µL)—were associated with an increased risk of SBP. This is consistent with studies showing that patients with higher Model for End-Stage Liver Disease (MELD) or Child-Pugh scores, indicating more severe liver disease, are at increased risk for SBP. This is likely why the American Association for the Study of Liver Diseases recommends SBP prophylaxis for patients with elevated bilirubin and higher Child-Pugh scores (see Table 2, p. 41).
Runyon et al showed that 15% of patients with low ascitic fluid protein developed SBP during their hospitalization, as compared with 2% of patients with ascitic fluid protein levels greater than 1 g/dL.8 A randomized, non-placebo-controlled trial by Navasa et al evaluating 70 cirrhotic patients with low ascitic protein levels showed a lower probability of developing SBP in the group placed on SBP prophylaxis with norfloxacin (5% vs. 31%).9 The six-month mortality rate was also lower (19% vs. 36%).
In contrast to the previous studies, Grothe et al found that the presence of SBP was not related to ascitic protein content.10 Given these conflicting studies, controversy remains over whether patients with low ascitic protein should receive long-term prophylactic antibiotics.
Antibiotic Drawbacks
The consensus in the literature is that patients with ascites who are admitted with a GI bleed, or those with a history of SBP, should be placed on SBP prophylaxis. However, patients placed on long-term antibiotics are at risk for developing bacterial resistance. Bacterial resistance in cultures taken from cirrhotic patients with SBP has increased over the last decade, particularly in gram-negative bacteria.5 Patients who receive antibiotics in the pre-transplant setting also are at risk for post-transplant fungal infections.
Additionally, the antibiotic of choice for SBP prophylaxis is typically a fluoroquinolone, which can be expensive. However, numerous studies have shown that the cost of initiating prophylactic therapy for SBP in patients with a prior episode of SBP can be cheaper than treating SBP after diagnosis.2
Back to the Case
Our patient’s paracentesis was negative for SBP. Additionally, he does not have a history of SBP, nor does he have an active GI bleed. His only possible indication for SBP prophylaxis is low ascitic protein concentration. His electrolytes were all within normal limits. Additionally, total bilirubin was only slightly elevated at 2.3 mg/dL.
Based on the American Association for the Study of Liver Diseases guidelines, the patient was not started on SBP prophylaxis. Additionally, given his history of medication noncompliance, there is concern that he might not take the antibiotics as prescribed, thus leading to the development of bacterial resistance and more serious infections in the future.
Bottom Line
Patients with ascites and a prior episode of SBP, and those admitted to the hospital for GI bleeding, should be placed on SBP prophylaxis. SBP prophylaxis for low protein ascitic fluid remains controversial but is recommended by the American Association for the Study of Liver Diseases. TH
Dr. del Pino Jones is a hospitalist at the University of Colorado Denver.
References
- Ghassemi S, Garcia-Tsao G. Prevention and treatment of infections in patients with cirrhosis. Best Pract Res Clin Gastroenterol. 2007;21(1):77-93.
- Guarner C, Solà R, Soriono G, et al. Risk of a first community-acquired spontaneous bacterial peritonitis in cirrhotics with low ascitic fluid protein levels. Gastroenterology. 1999;117(2):414-419.
- Ginés P, Rimola A, Planas R, et al. Norfloxacin prevents spontaneous bacterial peritonitis recurrence in cirrhosis: results of a double-blind, placebo-controlled trial. Hepatology. 1990;12(4 Pt 1):716-724.
- Rolachon A, Cordier L, Bacq Y, et al. Ciprofloxacin and long-term prevention of spontaneous bacterial peritonitis: results of a prospective controlled trial. Hepatology. 1995;22(4 Pt 1):1171-1174.
- Saab S, Hernandez J, Chi AC, Tong MJ. Oral antibiotic prophylaxis reduces spontaneous bacterial peritonitis occurrence and improves short-term survival in cirrhosis: a meta-analysis. Am J Gastroenterol. 2009;104(4):993-1001.
- Deschênes M, Villeneuve J. Risk factors for the development of bacterial infections in hospitalized patients with cirrhosis. Am J Gastroenterol. 1999;94(8):2193-2197.
- Bernard B, Grangé J, Khac EN, Amiot X, Opolon P, Poynard T. Antibiotic prophylaxis for the prevention of bacterial infections in cirrhotic patients with gastrointestinal bleeding: a meta-analysis. Hepatology. 1999;29(6):1655-1661.
- Runyon B. Low-protein-concentration ascitic fluid is predisposed to spontaneous bacterial peritonitis. Gastroenterology. 1986;91(6):1343-1346.
- Navasa M, Fernandez J, Montoliu S, et al. Randomized, double-blind, placebo-controlled trial evaluating norfloxacin in the primary prophylaxis of spontaneous bacterial peritonitis in cirrhotics with renal impairment, hyponatremia or severe liver failure. J Hepatol. 2006;44(Suppl 2):S51.
- Grothe W, Lottere E, Fleig W. Factors predictive for spontaneous bacterial peritonitis (SBP) under routine inpatient conditions in patients with cirrhosis: a prospective multicenter trial. J Hepatol. 1990;34(4):547.
What Is the Role of BNP in Diagnosis and Management of Acutely Decompensated Heart Failure?
Case
A 76-year-old woman with a history of chronic obstructive pulmonary disease (COPD), congestive heart failure (CHF), and atrial fibrillation presents with shortness of breath. She is tachypneic, her pulse is 105 beats per minute, and her blood pressure is 105/60 mm Hg. She is obese and has an immeasurable venous pressure with decreased breath sounds in both lung bases, and irregular and distant heart sounds. What is the role of brain (or B-type) natriuretic peptide (BNP) in the diagnosis and management of this patient?
Overview
Each year, more than 1 million patients are admitted to hospitals with acutely decompensated heart failure (ADHF). Although many of these patients carry a pre-admission diagnosis of CHF, their common presenting symptoms are not specific for ADHF, leading to delays in diagnosis and therapy, increased diagnostic costs, and potentially worse outcomes. Clinical risk scores from NHANES and the Framingham Heart Study have limited sensitivity, missing nearly 20% of patients.1,2 Moreover, these scores are underused by clinicians, who depend heavily on clinical gestalt.3
Once ADHF is diagnosed, ongoing bedside assessment of volume status is a difficult and inexact science. The physiologic goal is achievement of normal left ventricular end diastolic volume; however, surrogate measures of this status, including weight change, venous pressure, and pulmonary and cardiac auscultatory findings, have significant limitations. After discharge, patients have high and heterogeneous risks of readmission, death, and other adverse events. Identifying patients with the highest risk might allow for intensive strategies to improve outcomes.
BNP is a neurohormone released from the ventricular cells in response to increased cardiac filling pressures. Plasma measurements of BNP have been shown to reflect volume status, to predict risk at admission and discharge, and to serve as a treatment guide in a variety of clinical settings.4 This simple laboratory test increasingly has been used to diagnose and manage ADHF; its utility and limitations deserve critical review.
Review of the Data
CHF diagnosis. Since introduction of the rapid BNP assay, several trials have evaluated its clinical utility in determining whether ADHF is the cause of a patient’s dyspnea. The largest of these trials, the Breathing Not Properly Multinational Study, conducted by McCullough et al, enrolled nearly 1,600 patients who presented with the primary complaint of dyspnea.5 After reviewing conventional clinical information, ED physicians were asked to determine the likelihood that ADHF was the etiology of a patient’s dyspnea. These likelihoods were classified as low (<20%), intermediate (20%-80%), or high (>80%). The admission BNP was recorded but was not available for the ED physician decisions.
The “gold standard” was the opinion of two adjudicating cardiologists who reviewed the cases retrospectively and determined whether the dyspnea resulted from ADHF. They were blinded to both the ED physician’s opinion and the BNP results. The accuracy of the ED physician’s initial assessment and the impact of the BNP results were compared with this gold standard.
For the entire cohort, the use of BNP (with a cutoff point of 100 pg/mL) would have improved the ED physician’s assessment from 74% diagnostic accuracy to 81%, which is statistically significant. Most important, in those patients initially given an intermediate likelihood of CHF, BNP results correctly classified 75% of these patients and rarely missed ADHF cases (<10%).
Atrial fibrillation. Since the original trials that established a BNP cutoff of 100 pg/mL for determining the presence of ADHF, several adjustments have been suggested. The presence of atrial fibrillation has been shown to increase BNP values independent of cardiac filling pressures. Breidthardt et al examined patients with atrial fibrillation presenting with dyspnea.4 In their analysis, using a cutoff of 100 pg/mL remained robust in identifying patients without ADHF. However, in the 100 pg/mL-500 pg/mL range, the test was not able to discriminate between atrial fibrillation and ADHF. Values greater than 500 pg/mL proved accurate in supporting the diagnosis of ADHF.
Renal failure. Renal dysfunction also elevates BNP levels independent of filling pressures. McCullough et al re-examined data from their Breathing Not Properly Multinational Study and found that the glomerular filtration rate (GFR) was inversely related to BNP levels.5 They recommend using a cutoff point of 200 pg/mL when the GFR is below 60 mL/min/1.73 m2. Other authors recommend not using BNP levels to diagnose ADHF when the GFR is less than 60 mL/min/1.73 m2 due to the lack of data supporting this approach. Until clarified, clinicians should be cautious when interpreting BNP elevations in the setting of kidney disease.
Obesity. Obesity has a negative effect on BNP levels, decreasing the sensitivity of the test in these patients.6 Although no study defines how to adjust for body mass index (BMI), clinicians should be cautious about using a low BNP to rule out ADHF in a dyspneic obese patient.
Historical BNP values. If historical BNP values are available, studies of biological variation have shown that an increase of 66% to 123% from baseline represents a clinically meaningful increase in cardiac filling pressures. Smaller changes could merely represent biological variation and should be interpreted cautiously.7
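The cutoff adjustments discussed above can be gathered into a single decision sketch. This is an illustrative composite of the cited studies, not a validated clinical algorithm; the function name is ours, and the thresholds simply restate the numbers in the text (obesity lowers BNP but has no defined adjustment, so it is noted only in the docstring):

```python
def interpret_bnp(bnp_pg_ml, atrial_fibrillation=False, gfr_ml_min=None):
    """Illustrative restatement of the BNP cutoffs discussed above.

    Not a validated algorithm; obesity decreases BNP and is not
    adjusted for here.
    """
    if gfr_ml_min is not None and gfr_ml_min < 60:
        # McCullough et al. suggest a 200 pg/mL cutoff in renal dysfunction;
        # other authors advise against using BNP at all in this setting.
        return "interpret cautiously; consider 200 pg/mL cutoff"
    if bnp_pg_ml < 100:
        return "ADHF unlikely"
    if atrial_fibrillation and bnp_pg_ml <= 500:
        # Breidthardt et al.: 100-500 pg/mL cannot discriminate AF from ADHF.
        return "indeterminate in atrial fibrillation"
    return "supports ADHF"

print(interpret_bnp(950, atrial_fibrillation=True))  # -> supports ADHF
```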
Cost effectiveness. The cost effectiveness of using BNP measurements in dyspneic ED patients has been examined as well. Mueller et al found in a Swiss hospital that BNP testing was associated with a 25% decrease in treatment cost, length of stay (LOS), and ICU usage.8 However, LOS is significantly longer in Switzerland compared with the U.S., and given that much of the cost savings was attributed to reducing LOS, it is not possible to extrapolate these data to the U.S. health system. More evidence is needed to truly evaluate the cost effectiveness of BNP testing.
Serial BNP testing. Once a patient has been diagnosed with ADHF and admitted to the hospital, diuretics are indicated with the goal of achieving euvolemia. The bedside assessment of volume status remains a difficult and inexact science, and failure to appropriately remove fluid is associated with readmissions. Conversely, overdiuresis with a concomitant rise in creatinine has been associated with increased morbidity and mortality.
Several studies have shown that the reduction of volume associated with diuretic administration is coupled with a rapid decrease in BNP levels. Therefore, serial BNP measurement has been evaluated as a tool to guide the daily assessment of volume status in patients admitted with ADHF. Unfortunately, frequent measurements of BNP reveal that a great deal of variance, or “noise,” is present in these repeat measurements. Data do not clearly show how to incorporate serial BNP measurements into daily diuretic management.9
Mortality prediction. Nearly 3.5% of admitted heart failure patients will die during their hospitalization. For perspective, the rate of hospital mortality with acute myocardial infarction is 7%. BNP serves as a powerful and independent predictor of inpatient mortality. The ADHERE (Acute Decompensated Heart Failure National Registry) study showed that when divided into BNP quartiles of <430 pg/mL, 430 pg/mL to 839 pg/mL, 840 pg/mL to 1,729 pg/mL, and >1,730 pg/mL, patients’ risk of inpatient death was accurately predicted as 1.9%, 2.8%, 3.8%, and 6.0%, respectively.10 Even when adjusted for other risk factors, BNP remained a powerful predictor; the mortality rate more than doubled from the lowest to highest quartile.
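The ADHERE quartiles above amount to a simple lookup table; a sketch restating those published figures (structure and names are ours):

```python
# In-hospital mortality by admission BNP quartile from the ADHERE registry,
# restating the figures quoted above (boundaries in pg/mL).
ADHERE_QUARTILES = [
    (430, 0.019),           # < 430 pg/mL          -> 1.9%
    (840, 0.028),           # 430 to 839 pg/mL     -> 2.8%
    (1730, 0.038),          # 840 to 1,729 pg/mL   -> 3.8%
    (float("inf"), 0.060),  # > 1,730 pg/mL        -> 6.0%
]

def adhere_inpatient_mortality(bnp_pg_ml):
    """Return the registry mortality rate for the quartile containing this BNP."""
    for upper_bound, rate in ADHERE_QUARTILES:
        if bnp_pg_ml < upper_bound:
            return rate

print(adhere_inpatient_mortality(950))  # -> 0.038
```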
Different strategies have been proposed to improve the outcomes in these highest-risk patients; however, to date, no evidence-based strategy offers a meaningful way to reduce inpatient mortality beyond the current standard of care.
Readmission and 30-day mortality. The 30-day readmission rate after discharge for ADHF is more than 25%. A study of Medicare patients showed that more than $17 billion (more than 15% of all Medicare payments to hospitals) was associated with unplanned rehospitalizations.11 As payment bundling develops, hospitals have an enormous incentive to identify CHF patients with the highest risk of readmission and attempt to mitigate that risk.
From a patient-centered view, a patient discharged after ADHF also faces a 1 in 10 chance of dying within the first 30 days.
At discharge, BNP serves as a powerful and independent marker of increased risk of readmission, morbidity, and mortality. O’Connor et al developed a discharge risk model in patients with severe left ventricular dysfunction; the ESCAPE risk model and discharge score showed elevated BNP was the single most powerful predictor of six-month mortality.12 For every doubling of the BNP, the odds of death at six months increased by 1.4 times.
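The "1.4 times per doubling" relationship from the ESCAPE model can be written as a small formula. This is a sketch of the published relationship only; the function name is ours:

```python
import math

def escape_odds_ratio(bnp_baseline, bnp_current, or_per_doubling=1.4):
    """Odds ratio for six-month death implied by a change in discharge BNP,
    assuming the ESCAPE finding of a 1.4x odds increase per doubling."""
    doublings = math.log2(bnp_current / bnp_baseline)
    return or_per_doubling ** doublings

# A fourfold higher discharge BNP implies roughly 1.4**2, or about 2x, the odds.
print(round(escape_odds_ratio(250, 1000), 2))  # -> 1.96
```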
After combining discharge BNP with other factors, the ESCAPE discharge score was fairly successful at discriminating between patients who would and would not survive to six months. By identifying these outpatients, intensive management strategies could be focused on individuals with the highest risk. The data support the idea that readmission reductions are significant when outpatients obtain early follow-up. Many healthcare centers struggle to schedule early follow-up for all heart failure patients.
As such, the ability to target individuals with the highest discharge scores for intensive follow-up might improve outcomes. These patients could undergo early evaluation for such advanced therapies as resynchronization, left ventricular assist device implantation, or listing for transplantation. Currently, this strategy is not proven. It also is possible that these high-risk patients might have such advanced disease that their risk cannot be modified by our current medications and advanced therapies.
Back to the Case
This patient has symptoms and signs that could be caused by ADHF or COPD. Her presentation is consistent with an intermediate probability of ADHF. A rapid BNP reveals a level of 950 pg/mL.
Even considering the higher cutoff required because of her coexistent atrial fibrillation, her BNP is consistent with ADHF. Additionally, her obesity likely has decreased the true value of her BNP. A previous BNP drawn when the patient was not in ADHF was 250 pg/mL, meaning a 280% increase is present, well beyond the threshold for biological variation.
She was admitted and treated with intravenous diuretics with improvement in her congestion and relief of her symptoms. Daily BNPs were not drawn, and her diuretics were titrated based on bedside clinical assessments. Her admission BNP elevation would predict a moderately high risk of short- and intermediate-term morbidity and mortality.
At discharge, a repeat BNP also could add to her risk stratification, though it would not be clear what to do with this prognostic information beyond the standard of care.
Bottom Line
BNP measurement in specific situations can complement conventional clinical information in determining the presence of ADHF and also can enhance clinicians’ ability to risk-stratify patients during and after hospitalization. TH
Dr. Wolfe is a hospitalist and assistant professor of medicine at the University of Colorado Denver.
References
- Schocken DD, Arrieta MI, Leaverton PE, Ross EA. Prevalence and mortality of congestive heart failure in the United States. J Am Coll Cardiol. 1992;20(2):301-306.
- McKee PA, Castelli WP, McNamara PM, Kannel WB. The natural history of congestive heart failure: the Framingham study. N Engl J Med. 1971;285(26):1441-1446.
- Wang CS, FitzGerald JM, Schulzer M, Mak E, Ayas NT. Does this dyspneic patient in the emergency department have congestive heart failure? JAMA. 2005;294(15):1944-1956.
- Breidthardt T, Noveanu M, Cayir S, et al. The use of B-type natriuretic peptide in the management of patients with atrial fibrillation and dyspnea. Int J Cardiol. 2009;136(2):193-199.
- McCullough PA, Duc P, Omland T, et al. B-type natriuretic peptide and renal function in the diagnosis of heart failure: an analysis from the Breathing Not Properly Multinational Study. Am J Kidney Dis. 2003;41(3):571-579.
- Iwanaga Y, Hihara Y, Nizuma S, et al. BNP in overweight and obese patients with heart failure: an analysis based on the BNP-LV diastolic wall stress relationship. J Card Fail. 2007;13(8):663-667.
- O’Hanlon R, O’Shea P, Ledwidge M. The biologic variability of B-type natriuretic peptide and N-terminal pro-B-type natriuretic peptide in stable heart failure patients. J Card Fail. 2007;13(1):50-55.
- Mueller C, Laule-Kilian K, Schindler C, et al. Cost-effectiveness of B-type natriuretic peptide testing in patients with acute dyspnea. Arch Intern Med. 2006;166(1):1081-1087.
- Wu AH. Serial testing of B-type natriuretic peptide and NTpro-BNP for monitoring therapy of heart failure: the role of biologic variation in the interpretation of results. Am Heart J. 2006;152(5):828-834.
- Fonarow GC, Peacock WF, Phillips CO, et al; ADHERE Scientific Advisory Committee and Investigators. Admission B-type natriuretic peptide levels and in-hospital mortality in acute decompensated heart failure. J Am Coll Cardiol. 2007;49(19):1943-1950.
- Jencks SF, Williams MC, Coleman EA. Rehospitalizations among patients in the Medicare fee-for-service program. N Engl J Med. 2009;360(14):1418-1428.
- O’Connor CM, Hasselblad V, Mehta RH, et al. Triage after hospitalization with advanced heart failure: the ESCAPE (Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness) risk model and discharge score. J Am Coll Cardiol. 2010;55(9):872-878.
As such, the ability to target individuals with the highest discharge scores for intensive follow-up might improve outcomes. These patients could undergo early evaluation for such advanced therapies as resynchronization, left ventricular assist device implantation, or listing for transplantation. Currently, this strategy is not proven. It also is possible that these high-risk patients might have such advanced diseases that their risk cannot be modified by our current medications and advanced therapies.
Back to the Case
This patient has symptoms and signs that could be caused by ADHF or COPD. Her presentation is consistent with an intermediate probability of ADHF. A rapid BNP reveals a level of 950 pg/mL.
Even considering the higher cutoff required because of her coexistent atrial fibrillation, her BNP is consistent with ADHF. Additionally, her obesity likely has decreased the true value of her BNP. A previous BNP drawn when the patient was not in ADHF was 250 ng/mL, meaning that at least a 70% increase is present.
She was admitted and treated with intravenous diuretics with improvement in her congestion and relief of her symptoms. Daily BNPs were not drawn and her diuretics were titrated based on bedside clinical assessments. Her admission BNP elevation would predict a moderately high risk of short- and intermediate term of morbidity and mortality.
At discharge, a repeat BNP also could add to her risk stratification, though it would not be clear what do with this prognostic information beyond the standard of care.
Bottom Line
BNP measurement in specific situations can complement conventional clinical information in determining the presence of ADHF and also can enhance clinicians’ ability to risk-stratify patients during and after hospitalization. TH
Dr. Wolfe is a hospitalist and assistant professor of medicine at the University of Colorado Denver.
References
- Schocken DD, Arrieta MI, Leaverton PE, Ross EA. Prevalence and mortality of congestive heart failure in the United States. J Am Coll Cardiol. 1992;20(2):301-306.
- McKee PA, Castelli WP, McNamara PM, Kannel WB. The natural history of congestive heart failure: the Framingham study. N Eng J Med. 1971;285(26):1441-1446.
- Wang CS, FitzGerald JM, Schulzer M, Mak E, Ayas NT. Does this dyspneic patient in the emergency department have congestive heart failure? JAMA. 2005;294(15):1944-1956.
- Breidthardt T, Noveanu M, Cayir S, et al. The use of B-type natriuretic peptide in the management of patients with atrial fibrillation and dyspnea. Int J Cardiol. 2009;136(2):193-199.
- McCullough PA, Duc P, Omland T, et al. B-type natriuretic peptide and renal function in the diagnosis of heart failure: an analysis from the Breathing Not Properly Multinational Study. Am J Kidney Dis. 2003;41(3):571-579.
- Iwanaga Y, Hihara Y, Nizuma S, et al. BNP in overweight and obese patients with heart failure: an analysis based on the BNP-LV diastolic wall stress relationship. J Card Fail. 2007;13(8):663-667.
- O’Hanlon R, O’Shea P, Ledwidge M. The biologic variability of B-type natriuretic peptide and N-terminal pro-B-type natriuretic peptide in stable heart failure patients. J Card Fail. 2007;13(1):50-55.
- Mueller C, Laule-Kilian K, Schindler C, et al. Cost-effectiveness of B-type natriuretic peptide testing in patients with acute dyspnea. Arch Intern Med. 2006;166(1):1081-1087.
- Wu AH. Serial testing of B-type natriuretic peptide and NTpro-BNP for monitoring therapy of heart failure: the role of biologic variation in the interpretation of results. Am Heart J. 2006;152(5):828-834.
- Fonarow GC, Peacock WF, Phillips CO, et al. ADHERE Scientific Advisory Committee and Investigators. Admission B-type natriuretic peptide levels and in-hospital mortality in acute decompensated heart failure. J Am Coll Cardiol. 2007;48 (19):1943-1950.
- Jencks SF, Williams MC, Coleman EA. Rehospitalizations among patients in the Medicare fee-for-service program. N Engl J Med. 2009;360(14):1418-1428.
- O’Connor CM, Hasselblad V, Mehta RH, et al. Triage after hospitalization with advanced heart failure: the ESCAPE (Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness) risk model and discharge score. J Am Coll Cardiol. 2010;55(9):872-878.
Case
A 76-year-old woman with a history of chronic obstructive pulmonary disease (COPD), congestive heart failure (CHF), and atrial fibrillation presents with shortness of breath. She is tachypneic, her pulse is 105 beats per minute, and her blood pressure is 105/60 mm Hg. She is obese and has an immeasurable venous pressure with decreased breath sounds in both lung bases, and irregular and distant heart sounds. What is the role of brain (or B-type) natriuretic peptide (BNP) in the diagnosis and management of this patient?
Overview
Each year, more than 1 million patients are admitted to hospitals with acutely decompensated heart failure (ADHF). Although many of these patients carry a pre-admission diagnosis of CHF, their common presenting symptoms are not specific for ADHF, which leads to delays in diagnosis and therapy initiation, increased diagnostic costs, and potentially worse outcomes. Clinical risk scores from NHANES and the Framingham Heart Study have limited sensitivity, missing nearly 20% of patients.1,2 Moreover, these scores are underused by clinicians, who depend heavily on clinical gestalt.3
Once ADHF is diagnosed, ongoing bedside assessment of volume status is a difficult and inexact science. The physiologic goal is achievement of normal left ventricular end diastolic volume; however, surrogate measures of this status, including weight change, venous pressure, and pulmonary and cardiac auscultatory findings, have significant limitations. After discharge, patients have high and heterogeneous risks of readmission, death, and other adverse events. Identifying patients with the highest risk might allow for intensive strategies to improve outcomes.
BNP is a neurohormone released from the ventricular cells in response to increased cardiac filling pressures. Plasma measurements of BNP have been shown to reflect volume status, to predict risk at admission and discharge, and to serve as a treatment guide in a variety of clinical settings.4 This simple laboratory test increasingly has been used to diagnose and manage ADHF; its utility and limitations deserve critical review.
Review of the Data
CHF diagnosis. Since introduction of the rapid BNP assay, several trials have evaluated its clinical utility in determining whether ADHF is the cause of a patient’s dyspnea. The largest of these trials, the Breathing Not Properly Multinational Study, conducted by McCullough et al, enrolled nearly 1,600 patients who presented with the primary complaint of dyspnea.5 After reviewing conventional clinical information, ED physicians were asked to determine the likelihood that ADHF was the etiology of a patient’s dyspnea. These likelihoods were classified as low (<20%), intermediate (20%-80%), or high (>80%). The admission BNP was recorded but was not available for the ED physician decisions.
The “gold standard” was the opinion of two adjudicating cardiologists who reviewed the cases retrospectively and determined whether the dyspnea resulted from ADHF. They were blinded to both the ED physician’s opinion and the BNP results. The accuracy of the ED physician’s initial assessment and the impact of the BNP results were compared with this gold standard.
For the entire cohort, the use of BNP (with a cutoff point of 100 pg/mL) would have improved the ED physician’s assessment from 74% diagnostic accuracy to 81%, which is statistically significant. Most important, in those patients initially given an intermediate likelihood of CHF, BNP results correctly classified 75% of these patients and rarely missed ADHF cases (<10%).
Atrial fibrillation. Since the original trials that established a BNP cutoff of 100 pg/mL for determining the presence of ADHF, several adjustments have been suggested. The presence of atrial fibrillation has been shown to increase BNP values independent of cardiac filling pressures. Breidthardt et al examined patients with atrial fibrillation presenting with dyspnea.4 In their analysis, using a cutoff of 100 pg/mL remained robust in identifying patients without ADHF. However, in the 100 pg/mL-500 pg/mL range, the test was not able to discriminate between atrial fibrillation and ADHF. Values greater than 500 pg/mL proved accurate in supporting the diagnosis of ADHF.
Renal failure. Renal dysfunction also elevates BNP levels independent of filling pressures. McCullough et al re-examined data from their Breathing Not Properly Multinational Study and found that the glomerular filtration rate (GFR) was inversely related to BNP levels.5 They recommend using a cutoff point of 200 pg/mL when the GFR is below 60 mL/min/1.73 m². Other authors recommend against using BNP levels to diagnose ADHF when the GFR is less than 60 mL/min/1.73 m² because of the lack of data supporting this approach. Until this is clarified, clinicians should be cautious when interpreting BNP elevations in the setting of kidney disease.
Obesity. Obesity has a negative effect on BNP levels, decreasing the sensitivity of the test in these patients.6 Although no study defines how to adjust for body mass index (BMI), clinicians should be cautious about using a low BNP to rule out ADHF in a dyspneic obese patient.
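The cutoff adjustments described above (a 100 pg/mL baseline, a 100 pg/mL-500 pg/mL indeterminate zone in atrial fibrillation, a 200 pg/mL cutoff when GFR is below 60, and caution with low values in obesity) can be collected into a short sketch. This is purely illustrative; the function name and structure are ours, not a validated clinical tool:

```python
def interpret_bnp(bnp, afib=False, gfr=90, obese=False):
    """Illustrative only: BNP (pg/mL) interpretation with the
    adjustments discussed above. gfr is in mL/min/1.73 m^2."""
    if afib:
        # In atrial fibrillation, 100-500 pg/mL cannot discriminate
        # between AF alone and ADHF (Breidthardt et al).
        if bnp < 100:
            result = "ADHF unlikely"
        elif bnp <= 500:
            result = "indeterminate"
        else:
            result = "supports ADHF"
    else:
        # McCullough et al propose a 200 pg/mL cutoff when GFR < 60.
        cutoff = 200 if gfr < 60 else 100
        result = "supports ADHF" if bnp >= cutoff else "ADHF unlikely"
    if obese and result == "ADHF unlikely":
        # Obesity lowers measured BNP, so a low value is less reassuring.
        result += " (caution: obesity lowers BNP)"
    return result

print(interpret_bnp(950, afib=True))  # supports ADHF
```

Any real decision, of course, rests on the full clinical picture rather than a single threshold.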
Historical BNP values. If historical BNP values are available, studies of biological variation suggest that only an increase of roughly 66% to 123% from baseline reliably represents a clinically meaningful rise in cardiac filling pressures. Smaller changes could merely reflect biological variation and should be interpreted cautiously.7
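As a worked example of that threshold, the percent change from a historical baseline can be checked against the 66%-123% biological-variation band (the values below are for illustration):

```python
def percent_change(baseline, current):
    """Percent rise of the current BNP over a historical baseline."""
    return 100.0 * (current - baseline) / baseline

# A rise from 250 to 950 pg/mL is a 280% increase, well above the
# ~66%-123% that biological variation alone might explain.
print(percent_change(250, 950))  # 280.0
```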
Cost effectiveness. The cost effectiveness of using BNP measurements in dyspneic ED patients has been examined as well. Mueller et al found in a Swiss hospital that BNP testing was associated with a 25% decrease in treatment cost, length of stay (LOS), and ICU usage.8 However, LOS is significantly longer in Switzerland compared with the U.S., and given that much of the cost savings was attributed to reducing LOS, it is not possible to extrapolate these data to the U.S. health system. More evidence is needed to truly evaluate the cost effectiveness of BNP testing.
Serial BNP testing. Once a patient has been diagnosed with ADHF and admitted to the hospital, diuretics are indicated with the goal of achieving euvolemia. The bedside assessment of volume status remains a difficult and inexact science, and failure to appropriately remove fluid is associated with readmissions. Conversely, overdiuresis with a concomitant rise in creatinine has been associated with increased morbidity and mortality.
Several studies have shown that the reduction of volume associated with diuretic administration is coupled with a rapid decrease in BNP levels. Therefore, serial BNP measurement has been evaluated as a tool to guide the daily assessment of volume status in patients admitted with ADHF. Unfortunately, frequent measurements of BNP reveal that a great deal of variance, or “noise,” is present in these repeat measurements. Data do not clearly show how to incorporate serial BNP measurements into daily diuretic management.9
Mortality prediction. Nearly 3.5% of admitted heart failure patients will die during their hospitalization. For perspective, the rate of hospital mortality with acute myocardial infarction is 7%. BNP serves as a powerful and independent predictor of inpatient mortality. The ADHERE (Acute Decompensated Heart Failure National Registry) study showed that when divided into BNP quartiles of <430 pg/mL, 430 pg/mL to 839 pg/mL, 840 pg/mL to 1,729 pg/mL, and >1,730 pg/mL, patients’ risk of inpatient death was accurately predicted as 1.9%, 2.8%, 3.8%, and 6.0%, respectively.10 Even when adjusted for other risk factors, BNP remained a powerful predictor; the mortality rate more than doubled from the lowest to highest quartile.
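The ADHERE quartiles amount to a simple lookup table. A minimal sketch (ours, for illustration only, using the published quartile boundaries and observed mortality rates):

```python
import bisect

# ADHERE quartile upper bounds (pg/mL) and observed in-hospital
# mortality (%) for <430, 430-839, 840-1,729, and >1,730 pg/mL.
QUARTILE_BOUNDS = [430, 840, 1730]
MORTALITY_PCT = [1.9, 2.8, 3.8, 6.0]

def adhere_inpatient_mortality(bnp):
    """Observed in-hospital mortality (%) for an admission BNP."""
    return MORTALITY_PCT[bisect.bisect_right(QUARTILE_BOUNDS, bnp)]

print(adhere_inpatient_mortality(950))  # 3.8
```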
Different strategies have been proposed to improve the outcomes in these highest-risk patients; however, to date, no evidence-based strategy offers a meaningful way to reduce inpatient mortality beyond the current standard of care.
Readmission and 30-day mortality. The 30-day readmission rate after discharge for ADHF is more than 25%. A study of Medicare patients showed that more than $17 billion (more than 15% of all Medicare payments to hospitals) was associated with unplanned rehospitalizations.11 As payment-bundling trends develop, hospitals have an enormous incentive to identify CHF patients at the highest risk of readmission and attempt to mitigate that risk.
From a patient-centered view, upon hospital discharge a patient with ADHF also realizes a 1 in 10 chance of dying within the first 30 days.
At discharge, BNP serves as a powerful and independent marker of increased risk of readmission, morbidity, and mortality. O’Connor et al developed a discharge risk model in patients with severe left ventricular dysfunction; the ESCAPE risk model and discharge score showed elevated BNP was the single most powerful predictor of six-month mortality.12 For every doubling of the BNP, the odds of death at six months increased by 1.4 times.
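Because the 1.4 odds ratio applies per doubling, it compounds across multiple doublings. A back-of-the-envelope sketch (our arithmetic, not part of the published ESCAPE score; the example values are hypothetical):

```python
import math

def odds_multiplier(baseline_bnp, discharge_bnp, or_per_doubling=1.4):
    """Implied multiplier on the odds of six-month death, given an
    odds ratio of ~1.4 per doubling of BNP (ESCAPE finding)."""
    doublings = math.log2(discharge_bnp / baseline_bnp)
    return or_per_doubling ** doublings

# Two doublings (e.g. 250 -> 1,000 pg/mL) compound to 1.4^2:
print(round(odds_multiplier(250, 1000), 2))  # 1.96
```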
After combining discharge BNP with other factors, the ESCAPE discharge score was fairly successful at discriminating between patients who would and would not survive to six months. By identifying these outpatients, intensive management strategies could be focused on individuals with the highest risk. The data support the idea that readmission reductions are significant when outpatients obtain early follow-up. Many healthcare centers struggle to schedule early follow-up for all heart failure patients.
As such, the ability to target individuals with the highest discharge scores for intensive follow-up might improve outcomes. These patients could undergo early evaluation for such advanced therapies as resynchronization, left ventricular assist device implantation, or listing for transplantation. Currently, this strategy is not proven. It also is possible that these high-risk patients have disease so advanced that their risk cannot be modified by current medications and advanced therapies.
Back to the Case
This patient has symptoms and signs that could be caused by ADHF or COPD. Her presentation is consistent with an intermediate probability of ADHF. A rapid BNP reveals a level of 950 pg/mL.
Even considering the higher cutoff required because of her coexistent atrial fibrillation, her BNP is consistent with ADHF. Additionally, her obesity likely has decreased the true value of her BNP. A previous BNP drawn when the patient was not in ADHF was 250 pg/mL, meaning her level has risen 280%, well beyond what biological variation alone would explain.
She was admitted and treated with intravenous diuretics, with improvement in her congestion and relief of her symptoms. Daily BNPs were not drawn, and her diuretics were titrated based on bedside clinical assessments. Her admission BNP elevation would predict a moderately high risk of short- and intermediate-term morbidity and mortality.
At discharge, a repeat BNP also could add to her risk stratification, though it would not be clear what to do with this prognostic information beyond the standard of care.
Bottom Line
BNP measurement in specific situations can complement conventional clinical information in determining the presence of ADHF and also can enhance clinicians’ ability to risk-stratify patients during and after hospitalization. TH
Dr. Wolfe is a hospitalist and assistant professor of medicine at the University of Colorado Denver.
References
- Schocken DD, Arrieta MI, Leaverton PE, Ross EA. Prevalence and mortality of congestive heart failure in the United States. J Am Coll Cardiol. 1992;20(2):301-306.
- McKee PA, Castelli WP, McNamara PM, Kannel WB. The natural history of congestive heart failure: the Framingham study. N Engl J Med. 1971;285(26):1441-1446.
- Wang CS, FitzGerald JM, Schulzer M, Mak E, Ayas NT. Does this dyspneic patient in the emergency department have congestive heart failure? JAMA. 2005;294(15):1944-1956.
- Breidthardt T, Noveanu M, Cayir S, et al. The use of B-type natriuretic peptide in the management of patients with atrial fibrillation and dyspnea. Int J Cardiol. 2009;136(2):193-199.
- McCullough PA, Duc P, Omland T, et al. B-type natriuretic peptide and renal function in the diagnosis of heart failure: an analysis from the Breathing Not Properly Multinational Study. Am J Kidney Dis. 2003;41(3):571-579.
- Iwanaga Y, Hihara Y, Nizuma S, et al. BNP in overweight and obese patients with heart failure: an analysis based on the BNP-LV diastolic wall stress relationship. J Card Fail. 2007;13(8):663-667.
- O’Hanlon R, O’Shea P, Ledwidge M. The biologic variability of B-type natriuretic peptide and N-terminal pro-B-type natriuretic peptide in stable heart failure patients. J Card Fail. 2007;13(1):50-55.
- Mueller C, Laule-Kilian K, Schindler C, et al. Cost-effectiveness of B-type natriuretic peptide testing in patients with acute dyspnea. Arch Intern Med. 2006;166(1):1081-1087.
- Wu AH. Serial testing of B-type natriuretic peptide and NTpro-BNP for monitoring therapy of heart failure: the role of biologic variation in the interpretation of results. Am Heart J. 2006;152(5):828-834.
- Fonarow GC, Peacock WF, Phillips CO, et al; ADHERE Scientific Advisory Committee and Investigators. Admission B-type natriuretic peptide levels and in-hospital mortality in acute decompensated heart failure. J Am Coll Cardiol. 2007;48(19):1943-1950.
- Jencks SF, Williams MC, Coleman EA. Rehospitalizations among patients in the Medicare fee-for-service program. N Engl J Med. 2009;360(14):1418-1428.
- O’Connor CM, Hasselblad V, Mehta RH, et al. Triage after hospitalization with advanced heart failure: the ESCAPE (Evaluation Study of Congestive Heart Failure and Pulmonary Artery Catheterization Effectiveness) risk model and discharge score. J Am Coll Cardiol. 2010;55(9):872-878.
What Should I Do If I Get a Needlestick?
Case
While placing a central line, you sustain a needlestick. You’ve washed the area thoroughly with soap and water, but you are concerned about contracting a bloodborne pathogen. What is the risk of contracting such a pathogen, and what can be done to reduce this risk?
Overview
Needlestick injuries are a common occupational hazard in the hospital setting. According to the International Health Care Worker Safety Center (IHCWSC), approximately 295,000 hospital-based healthcare workers experience occupational percutaneous injuries annually. In 1991, Mangione et al surveyed internal-medicine house staff and found an annual incidence of 674 needlestick injuries per 1,000 participants.1 Other retrospective data estimate this risk to be as high as 839 per 1,000 healthcare workers annually.2 Because these figures rely on self-reporting, 2004 evidence from the Centers for Disease Control and Prevention (CDC) suggests the true annual incidence is much higher.2,3,4
More than 20 bloodborne pathogens (see Table 1, right) might be transmitted from contaminated needles or sharps, including human immunodeficiency virus (HIV), hepatitis B virus (HBV), and hepatitis C virus (HCV). A quick and appropriate response to a needlestick injury can greatly decrease the risk of disease transmission following an occupational exposure to potentially infectious materials.
Review of the Data
After any needlestick injury, an affected healthcare worker should wash the area with soap and water immediately. There is no contraindication to using antiseptic solutions, but there is also no evidence to suggest that this reduces the rates of disease transmission.
As decisions for post-exposure prophylaxis often need to be made within hours, a healthcare worker should seek care in the facility areas responsible for managing occupational exposures. Healthcare providers should always be encouraged and supported to report all sharps-related injuries to such departments.
The source patient should be identified and evaluated for potentially transmissible diseases, including HIV, HBV, and HCV. If indicated, the source patient should then undergo appropriate serological testing, and any indicated antiviral prophylaxis should be initiated (see Table 2, p. 19).
Risk of Seroconversion
For all bloodborne pathogens, a needlestick injury carries a greater risk for transmission than other occupational exposures (e.g. mucous membrane exposure). If a needlestick injury occurs in the setting of an infected patient source, the risk of disease transmission varies for HIV, HBV, and HCV (see Table 3, p. 19). In general, risk for seroconversion is increased with a deep injury, an injury with a device visibly contaminated with the source patient’s blood, or an injury involving a needle placed in the source patient’s artery or vein.3,5,6
Human immunodeficiency virus. Contracting HIV after a needlestick injury is rare. From 1981 to 2006, the CDC documented only 57 cases of HIV/AIDS in healthcare workers following occupational exposure and identified an additional 140 “possible” cases.5,6 Of the 57 documented cases, 48 involved a percutaneous injury.
Following needlestick injury involving a known HIV-positive source, the one-year risk of seroconversion has been estimated to be 0.3%.5,6 In 1997, Cardo and colleagues identified four factors associated with increased risk for seroconversion after a needlestick/sharps injury from a known positive-HIV source:
- Deep injury;
- Injury with a device visibly contaminated with the source patient’s blood;
- A procedure involving a needle placed in the source patient’s artery or vein; and
- Exposure to a source patient who died of AIDS in the two months following the occupational exposure.5
Hepatitis B virus. Widespread immunization of healthcare workers has led to a dramatic decline in occupationally acquired HBV. The CDC estimated that in 1985, approximately 12,500 new HBV infections occurred in healthcare workers.3 This estimate plummeted to approximately 500 new occupationally acquired HBV infections in 1997.3
Despite this, hospital-based healthcare personnel remain at risk for HBV transmission after a needlestick injury from a known positive patient source. Few studies have evaluated the occupational risk of HBV transmission after a needlestick injury. Buergler et al reported that following a needlestick injury involving a known HBV-positive source, the one-year risk of seroconversion was 0.76% to 7.35% for nonimmunized surgeons, and 0.23% to 2.28% for nonimmunized anesthesiologists.7
In the absence of post-exposure prophylaxis, an exposed healthcare worker has a 6% to 30% risk of becoming infected with HBV.3,8 The risk is greatest if the patient source is known to be hepatitis B e antigen-positive, a marker for greater disease infectivity. When given within one week of injury, post-exposure prophylaxis (PEP) with multiple doses of hepatitis B immune globulin (HBIG) provides an estimated 75% protection from transmission.
Healthcare workers who have received the hepatitis B vaccine and developed immunity have virtually no risk for infection.6,7
Hepatitis C virus. Prospective evaluation has demonstrated that the average risk of HCV transmission after percutaneous exposure to a known HCV-positive source ranges from 0% to 7%.3 The Italian Study Group on Occupational Risk of HIV and Other Bloodborne Infections evaluated HCV seroconversion within six months of a reported exposure with enzyme immunoassay and immunoblot assay. In this study, the authors found a seroconversion rate of 1.2%.9
Further, they suggested that HCV seroconversion only occurred from hollow-bore needles, as no seroconversions were noted in healthcare workers who sustained injuries with solid sharp objects.
Post-Exposure Management
The CDC does not recommend prophylaxis when source fluids make contact with intact skin. However, if a percutaneous occupational exposure has occurred, PEPs exist for HIV and HBV but not for HCV.3,6 If a source patient’s HIV, HBV, and HCV statuses are unknown, occupational-health personnel can interview the patient to evaluate his or her risks and initiate testing. Specific information about the time and nature of exposure should be documented.
When testing is indicated, it should be done following institutional and state-specific exposure-control policies and informed consent guidelines. In all situations, the decision to begin antiviral PEP should be carefully considered, weighing benefits of PEP versus the risks and toxicity of treatment.
Human immunodeficiency virus. If a source patient is known to be HIV-positive, has a positive rapid HIV test, or if HIV status cannot be quickly determined, PEP is indicated. Healthcare providers should be aware of rare cases in which the source patient initially tested HIV-seronegative but was subsequently found to have primary HIV infection.
Per 2004 CDC recommendations, PEP is indicated for all healthcare workers who sustain a percutaneous injury from a known HIV-positive source.3,8 For a less severe injury (e.g. solid needle or superficial injury), PEP with either a basic two-drug or three-drug regimen is indicated, depending on the source patient’s viral load.3,5,6,8
If the source patient has unknown HIV status, two-drug PEP is indicated based on the source patient’s HIV risk factors. In such patients, rapid HIV testing also is indicated to aid in determining the need for PEP. When the source HIV status is unknown, PEP is indicated in settings where exposure to HIV-infected persons is likely.
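The decision points above can be summarized in a toy helper. This is an assumption-laden sketch of the simplified logic, not CDC guidance; any real decision belongs with occupational health:

```python
def hiv_pep_indicated(source_status, source_risk_factors=False,
                      high_risk_setting=False):
    """source_status: 'positive', 'negative', or 'unknown'.
    Simplified from the 2004 CDC decision points described above."""
    if source_status == "positive":
        # PEP is indicated for all percutaneous injuries from a
        # known HIV-positive source.
        return True
    if source_status == "unknown":
        # Decide based on the source's risk factors, or on settings
        # where exposure to HIV-infected persons is likely.
        return source_risk_factors or high_risk_setting
    return False  # documented negative source

print(hiv_pep_indicated("unknown", source_risk_factors=True))  # True
```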
If PEP is indicated, it should be started as quickly as possible. The 2005 U.S. Public Health Service recommendations call for initiating two nucleosides for low-risk exposures and two nucleosides plus a boosted protease inhibitor for high-risk exposures.
Examples of commonly used dual-nucleoside regimens are zidovudine plus lamivudine (coformulated as Combivir) or tenofovir plus emtricitabine (coformulated as Truvada). Current recommendations indicate that PEP should be continued for four weeks, with concurrent clinical and laboratory evaluation for drug toxicity.
Hepatitis B virus. Numerous prospective studies have evaluated the post-exposure effectiveness of HBIG. When administered within 24 hours of exposure, HBIG might offer immediate passive protection against HBV infection. Additionally, if initiated within one week of percutaneous injury with a known HBV-positive source, multiple doses of HBIG provide an estimated 75% protection from transmission.
Although the combination of HBIG and the hepatitis B vaccine series has not been evaluated as PEP in the occupational setting, evidence in the perinatal setting suggests this regimen is more effective than HBIG alone.3,6,8
Hepatitis C virus. No PEP exists for HCV, and current recommendations for post-exposure management focus on early identification and treatment of chronic disease. Data are insufficient to recommend treatment of acute HCV infection in patients with no evidence of disease, and the appropriate regimen and dosing are unknown. Further, evidence suggests that treatment started early in the course of chronic infection could be just as effective and might eliminate the need to treat persons whose infection will spontaneously resolve.7
Back to the Case
Your needlestick occurred while using a hollow-bore needle to cannulate a source patient’s vein, placing you at higher risk for seroconversion. You immediately reported the exposure to the department of occupational health at your hospital. The source patient’s HIV, HBV, and HCV serological statuses were tested, and the patient was found to be HBV-positive. After appropriate counseling, you decide to receive HBIG prophylaxis to reduce your chance of becoming infected with HBV.
Bottom Line
Healthcare workers who suffer occupational needlestick injuries require immediate identification and attention to avoid transmission of such infectious diseases as HIV, HBV, and HCV. Source patients should undergo rapid serological testing to determine appropriate PEP. TH
Dr. Zehnder is a hospitalist in the Section of Hospital Medicine at the University of Colorado Denver.
References
- Mangione CM, Gerberding JL, Cummings SR. Occupational exposure to HIV: frequency and rates of underreporting of percutaneous and mucocutaneous exposures by medical housestaff. Am J Med. 1991;90(1):85-90.
- Lee JM, Botteman MF, Nicklasson L, et al. Needlestick injury in acute care nurses caring for patients with diabetes mellitus: a retrospective study. Curr Med Res Opin. 2005;21(5):741-747.
- Workbook for designing, implementing, and evaluating a sharps injury prevention program. Centers for Disease Control and Prevention website. Available at: www.cdc.gov/sharpssafety/pdf/WorkbookComplete.pdf. Accessed Sept. 13, 2010.
- Lee JM, Botteman MF, Xanthakos N, Nicklasson L. Needlestick injuries in the United States. Epidemiologic, economic, and quality of life issues. AAOHN J. 2005;53(3):117-133.
- Cardo DM, Culver DH, Ciesielski CA, et al. A case-control study of HIV seroconversion in health care workers after percutaneous exposure. Centers for Disease Control and Prevention Needlestick Surveillance Group. N Engl J Med. 1997;337(21):1485-1490.
- Exposure to blood: What healthcare personnel need to know. CDC website. Available at: www.cdc.gov/ncidod /dhqp/pdf/bbp/Exp_to_Blood.pdf. Accessed Aug. 31, 2010.
- Buergler JM, Kim R, Thisted RA, Cohn SJ, Lichtor JL, Roizen MF. Risk of human immunodeficiency virus in surgeons, anesthesiologists, and medical students. Anesth Analg. 1992;75(1):118-124.
- Updated U.S. Public Health Service guidelines for the management of occupational exposures to HBV, HCV, and HIV and recommendations for postexposure prophylaxis. CDC website. Available at: www.cdc.gov/mmwr/preview/mmwrhtml/rr5011a1.htm. Accessed Aug. 31, 2010.
- Puro V, Petrosillo N, Ippolito G. Risk of hepatitis C seroconversion after occupational exposure in health care workers. Italian Study Group on Occupational Risk of HIV and Other Bloodborne Infections. Am J Infect Control. 1995;23(5):273-277.
Case
While placing a central line, you sustain a needlestick. You’ve washed the area thoroughly with soap and water, but you are concerned about contracting a bloodborne pathogen. What is the risk of contracting such a pathogen, and what can be done to reduce this risk?
Overview
Needlestick injuries are a common occupational hazard in the hospital setting. According to the International Health Care Worker Safety Center (IHCWSC), approximately 295,000 hospital-based healthcare workers experience occupational percutaneous injuries annually. In 1991, Mangione et al surveyed internal-medicine house staff and found an annual incidence of 674 needlestick injuries per 1,000 participants.1 Other retrospective data estimate this risk to be as high as 839 per 1,000 healthcare workers annually.2 Because these figures rest on self-reported injuries, evidence from the Centers for Disease Control and Prevention (CDC) in 2004 suggests that the true annual incidence is considerably higher than current estimates.2,3,4
More than 20 bloodborne pathogens (see Table 1, right) might be transmitted from contaminated needles or sharps, including human immunodeficiency virus (HIV), hepatitis B virus (HBV), and hepatitis C virus (HCV). A quick and appropriate response to a needlestick injury can greatly decrease the risk of disease transmission following an occupational exposure to potentially infectious materials.
Review of the Data
After any needlestick injury, an affected healthcare worker should wash the area with soap and water immediately. There is no contraindication to using antiseptic solutions, but there is also no evidence to suggest that this reduces the rates of disease transmission.
As decisions for post-exposure prophylaxis often need to be made within hours, a healthcare worker should seek care in the facility areas responsible for managing occupational exposures. Healthcare providers should always be encouraged and supported to report all sharps-related injuries to such departments.
The source patient should be identified and evaluated for potentially transmissible diseases, including HIV, HBV, and HCV. If indicated, the source patient should then undergo appropriate serological testing, and any indicated antiviral prophylaxis should be initiated (see Table 2, p. 19).
Risk of Seroconversion
For all bloodborne pathogens, a needlestick injury carries a greater risk for transmission than other occupational exposures (e.g., mucous membrane exposure). If a needlestick injury occurs in the setting of an infected patient source, the risk of disease transmission varies for HIV, HBV, and HCV (see Table 3, p. 19). In general, risk for seroconversion is increased with a deep injury, an injury with a device visibly contaminated with the source patient’s blood, or an injury involving a needle placed in the source patient’s artery or vein.3,5,6
Human immunodeficiency virus. Contracting HIV after needlestick injury is rare. From 1981 to 2006, the CDC documented only 57 cases of HIV/AIDS in healthcare workers following occupational exposure and identified an additional 140 possible cases of occupationally acquired infection.5,6 Of the 57 documented cases, 48 involved a percutaneous injury.
Following needlestick injury involving a known HIV-positive source, the one-year risk of seroconversion has been estimated to be 0.3%.5,6 In 1997, Cardo and colleagues identified four factors associated with increased risk for seroconversion after a needlestick/sharps injury from a known positive-HIV source:
- Deep injury;
- Injury with a device visibly contaminated with the source patient’s blood;
- A procedure involving a needle placed in the source patient’s artery or vein; and
- Exposure to a source patient who died of AIDS in the two months following the occupational exposure.5
Hepatitis B virus. Widespread immunization of healthcare workers has led to a dramatic decline in occupationally acquired HBV. The CDC estimated that in 1985, approximately 12,500 new HBV infections occurred in healthcare workers.3 This estimate plummeted to approximately 500 new occupationally acquired HBV infections in 1997.3
Despite this, hospital-based healthcare personnel remain at risk for HBV transmission after a needlestick injury from a known positive patient source. Few studies have evaluated the occupational risk of HBV transmission after a needlestick injury. Buergler et al reported that following a needlestick injury involving a known HBV-positive source, the one-year risk of seroconversion was 0.76% to 7.35% for nonimmunized surgeons, and 0.23% to 2.28% for nonimmunized anesthesiologists.7
In the absence of post-exposure prophylaxis, an exposed healthcare worker has a 6% to 30% risk of becoming infected with HBV.3,8 The risk is greatest if the patient source is known to be hepatitis B e antigen-positive, a marker for greater disease infectivity. When given within one week of injury, post-exposure prophylaxis (PEP) with multiple doses of hepatitis B immune globulin (HBIG) provides an estimated 75% protection from transmission.
Healthcare workers who have received the hepatitis B vaccine and developed immunity have virtually no risk for infection.6,7
Hepatitis C virus. Prospective evaluation has demonstrated that the average risk of HCV transmission after percutaneous exposure to a known HCV-positive source ranges from 0% to 7%.3 The Italian Study Group on Occupational Risk of HIV and Other Bloodborne Infections evaluated HCV seroconversion within six months of a reported exposure with enzyme immunoassay and immunoblot assay. In this study, the authors found a seroconversion rate of 1.2%.9
Further, they suggested that HCV seroconversion only occurred from hollow-bore needles, as no seroconversions were noted in healthcare workers who sustained injuries with solid sharp objects.
Post-Exposure Management
The CDC does not recommend prophylaxis when source fluids make contact with intact skin. However, if a percutaneous occupational exposure has occurred, PEPs exist for HIV and HBV but not for HCV.3,6 If a source patient’s HIV, HBV, and HCV statuses are unknown, occupational-health personnel can interview the patient to evaluate his or her risks and initiate testing. Specific information about the time and nature of exposure should be documented.
When testing is indicated, it should be done following institutional and state-specific exposure-control policies and informed consent guidelines. In all situations, the decision to begin antiviral PEP should be carefully considered, weighing benefits of PEP versus the risks and toxicity of treatment.
Human immunodeficiency virus. If a source patient is known to be HIV-positive, has a positive rapid HIV test, or if HIV status cannot be quickly determined, PEP is indicated. Healthcare providers should be aware of rare cases in which the source patient initially tested HIV-seronegative but was subsequently found to have primary HIV infection.
Per 2004 CDC recommendations, PEP is indicated for all healthcare workers who sustain a percutaneous injury from a known HIV-positive source.3,8 For a less severe injury (e.g., solid needle or superficial injury), PEP with either a basic two-drug or three-drug regimen is indicated, depending on the source patient’s viral load.3,5,6,8
If the source patient has unknown HIV status, two-drug PEP is indicated based on the source patient’s HIV risk factors. In such patients, rapid HIV testing also is indicated to aid in determining the need for PEP. When the source HIV status is unknown, PEP is indicated in settings where exposure to HIV-infected persons is likely.
If PEP is indicated, it should be started as quickly as possible. The 2005 U.S. Public Health Service Recommendations for PEP recommend initiating two nucleosides for low-risk exposures and two nucleosides plus a boosted protease inhibitor for high-risk exposures.
Examples of commonly used dual nucleoside regimens are zidovudine plus lamivudine (coformulated as Combivir) or tenofovir plus emtricitabine (coformulated as Truvada). Current recommendations indicate that PEP should be continued for four weeks, with concurrent clinical and laboratory evaluation for drug toxicity.
Hepatitis B virus. Numerous prospective studies have evaluated the post-exposure effectiveness of HBIG. When administered within 24 hours of exposure, HBIG might offer immediate passive protection against HBV infection. Additionally, if initiated within one week of percutaneous injury with a known HBV-positive source, multiple doses of HBIG provide an estimated 75% protection from transmission.
Although the combination of HBIG and the hepatitis B vaccine series has not been evaluated as PEP in the occupational setting, evidence in the perinatal setting suggests this regimen is more effective than HBIG alone.3,6,8
Hepatitis C virus. No PEP exists for HCV, and current recommendations for post-exposure management focus on early identification and treatment of chronic disease. Data are insufficient to recommend treatment for acute HCV infection in patients with no evidence of disease, and the appropriate dosing of such a regimen is unknown. Further, evidence suggests that treatment started early in the course of chronic infection could be just as effective and might eliminate the need to treat persons whose infection will spontaneously resolve.7
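The pathogen-by-pathogen guidance above can be summarized in a short sketch. This is an illustrative simplification of the article's broad decision points, not a clinical decision tool; the function name and return strings are hypothetical, and real management depends on injury severity, viral load, worker immunity, and current guidelines.

```python
# Hypothetical sketch of the post-exposure decision points described in the text.
# Simplified for illustration; not clinical guidance.

def pep_recommendation(pathogen: str, source_status: str) -> str:
    """Return the article's broad PEP guidance for a percutaneous exposure.

    pathogen: "HIV", "HBV", or "HCV"
    source_status: "positive", "negative", or "unknown"
    """
    if pathogen == "HIV":
        if source_status in ("positive", "unknown"):
            # Two-drug vs. three-drug regimens depend on injury severity
            # and the source patient's viral load (see text).
            return "start antiretroviral PEP as soon as possible"
        return "no PEP; document the exposure"
    if pathogen == "HBV":
        if source_status == "positive":
            # HBIG within one week offers ~75% protection in nonimmune workers;
            # vaccinated, immune workers have virtually no risk.
            return "HBIG (plus vaccine series if nonimmune)"
        return "verify worker immunity; no PEP if immune"
    if pathogen == "HCV":
        # No PEP exists for HCV; management is early identification of infection.
        return "no PEP available; serial monitoring for seroconversion"
    raise ValueError(f"unknown pathogen: {pathogen}")
```

The key asymmetry the article stresses falls out directly: HIV and HBV exposures have active prophylaxis pathways, while HCV exposure management is limited to surveillance.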
Back to the Case
Your needlestick occurred while using a hollow-bore needle to cannulate a source patient’s vein, placing you at higher risk for seroconversion. You immediately reported the exposure to the department of occupational health at your hospital. The source patient’s HIV, HBV, and HCV serological statuses were tested, and the patient was found to be HBV-positive. After appropriate counseling, you decide to receive HBIG prophylaxis to reduce your chances of becoming infected with HBV.
Bottom Line
Occupational needlestick injuries in healthcare workers require immediate identification and attention to avoid transmission of such infectious diseases as HIV, HBV, and HCV. Source patients should undergo rapid serological testing to determine appropriate PEP. TH
Dr. Zehnder is a hospitalist in the Section of Hospital Medicine at the University of Colorado Denver.
References
- Mangione CM, Gerberding JL, Cummings SR. Occupational exposure to HIV: frequency and rates of underreporting of percutaneous and mucocutaneous exposures by medical housestaff. Am J Med. 1991;90(1):85-90.
- Lee JM, Botteman MF, Nicklasson L, et al. Needlestick injury in acute care nurses caring for patients with diabetes mellitus: a retrospective study. Curr Med Res Opin. 2005;21(5):741-747.
- Workbook for designing, implementing, and evaluating a sharps injury prevention program. Centers for Disease Control and Prevention website. Available at: www.cdc.gov/sharpssafety/pdf/WorkbookComplete.pdf. Accessed Sept. 13, 2010.
- Lee JM, Botteman MF, Xanthakos N, Nicklasson L. Needlestick injuries in the United States. Epidemiologic, economic, and quality of life issues. AAOHN J. 2005;53(3):117-133.
- Cardo DM, Culver DH, Ciesielski CA, et al. A case-control study of HIV seroconversion in health care workers after percutaneous exposure. Centers for Disease Control and Prevention Needlestick Surveillance Group. N Engl J Med. 1997;337(21):1485-1490.
- Exposure to blood: What healthcare personnel need to know. CDC website. Available at: www.cdc.gov/ncidod /dhqp/pdf/bbp/Exp_to_Blood.pdf. Accessed Aug. 31, 2010.
- Buergler JM, Kim R, Thisted RA, Cohn SJ, Lichtor JL, Roizen MF. Risk of human immunodeficiency virus in surgeons, anesthesiologists, and medical students. Anesth Analg. 1992;75(1):118-124.
- Updated U.S. Public Health Service guidelines for the management of occupational exposures to HBV, HCV, and HIV and recommendations for postexposure prophylaxis. CDC website. Available at: www.cdc.gov/mmwr/preview/mmwrhtml/rr5011a1.htm. Accessed Aug. 31, 2010.
- Puro V, Petrosillo N, Ippolito G. Risk of hepatitis C seroconversion after occupational exposure in health care workers. Italian Study Group on Occupational Risk of HIV and Other Bloodborne Infections. Am J Infect Control. 1995;23(5):273-277.
How Should Hospitalized Patients with Long QT Syndrome Be Managed?
Case
You are asked to admit a 63-year-old male with a history of hypertension and osteoarthritis. The patient, who fell at home, is scheduled for open repair of his femoral neck fracture the following day. The patient reports tripping over his granddaughter’s toys and denies any associated symptoms around the time of his fall. An electrocardiogram (ECG) reveals a corrected QT (QTc) interval of 480 ms. How should this hospitalized patient’s prolonged QT interval be managed?
Overview
Patients with a prolonged QT interval on routine ECG present an interesting dilemma for clinicians. Although QT prolongation—either congenital or acquired—has been associated with dysrhythmias, the risk of torsades de pointes and sudden cardiac death varies considerably based on myriad underlying factors.1 Therefore, the principal job of the clinician who has recognized QT prolongation is to assess and minimize the risk of the development of clinically significant dysrhythmias, and to be prepared to manage them should they arise.
The QT interval encompasses ventricular depolarization and repolarization. The ventricular action potential proceeds through five phases (see Figure 1, above):
- Phase 0 (upstroke): depolarization occurs with the opening of Na+ channels, triggering the inward Na+ current (INa) and causing the interior of the myocyte to become positively charged.
- Phase 1 (initial repolarization): the opening of K+ channels produces a transient outward K+ current (Ito).
- Phase 2 (plateau): inward current through Ca2+ channels (ICa-L) is balanced by outward current through slow rectifier K+ channels (IKs), and later through delayed, rapid K+ rectifier channels (IKr).
- Phase 3 (repolarization): the inward current deactivates while the outward current increases through the rapid delayed rectifier (IKr) and opening of inward rectifier channels (IK1), completing repolarization.
- Phase 4: the action potential returns to baseline and Na+ begins to enter the cell again.
The long QT syndrome (LQTS) is defined by a defect in these cardiac ion channels, which leads to abnormal repolarization, usually lengthening the QT interval and thus predisposing to ventricular dysrhythmias.2 It is estimated that as many as 85% of these syndromes are inherited, and up to 15% are acquired or sporadic.3 Depending on the underlying etiology of the LQTS, manifestations might first be appreciated at any time from in utero through adulthood.4 Symptoms including palpitations, syncope, seizures, or cardiac arrest bring these patients to medical attention.3 These symptoms are frequently elicited by physical or emotional stress, but they can occur without obvious inciting triggers.5 Patients who are symptomatic and untreated face a 20% mortality risk in the first year following diagnosis, and up to 50% within 10 years of diagnosis.4
How is Long QT Syndrome Diagnosed?
The LQTS diagnosis is based on clinical history in combination with ECG abnormalities.6 Important historical elements include symptoms of palpitations, syncope, seizures, or cardiac arrest.3 In addition, a family history of unexplained syncope or sudden death, especially at a young age, should raise LQTS suspicion.5
A variety of ECG findings can be witnessed in LQTS patients.4,5 Although the majority of patients have a QTc >440 ms, approximately one-third have a QTc ≤460 ms, and about 10% have normal QTc intervals.5 Other ECG abnormalities include notched, biphasic, or prolonged T-waves, and the presence of U-waves.4,5 Schwartz et al used these elements to publish criteria (see Table 1, right) that physicians can use to assess the probability that a patient has LQTS.7
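The QTc values cited here are heart-rate-corrected QT intervals. The article does not show the correction itself; Bazett's formula, the most widely used, divides the measured QT by the square root of the RR interval in seconds. A minimal sketch of that arithmetic:

```python
import math

def qtc_bazett(qt_ms: float, heart_rate_bpm: float) -> float:
    """Heart-rate-corrected QT interval (Bazett): QTc = QT / sqrt(RR in seconds)."""
    rr_seconds = 60.0 / heart_rate_bpm  # RR interval from heart rate
    return qt_ms / math.sqrt(rr_seconds)

# At 60 bpm the RR interval is 1 s, so QTc equals the measured QT;
# at faster rates the same measured QT yields a longer QTc.
```

Note that Bazett's formula is known to overcorrect at high heart rates and undercorrect at low ones, which is one reason the thresholds above (440-460 ms) should be interpreted alongside the clinical picture.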
Types of Long QT Syndromes
Because the risk of developing significant dysrhythmias with LQTS depends on both the actual QT interval (risk for sudden cardiac death is increased two to three times with QT >440 ms compared with QT <440 ms) and the specific underlying genotype, it is important to understand congenital and acquired LQTS and the associated triggers for torsades de pointes.
Congenital LQTS
Congenital LQTS is caused by mutations in cardiac ion channel proteins, primarily sodium and potassium channels.5,6 These defects either slow depolarization or lengthen repolarization, leading to heterogeneity of repolarization of the membrane.5 This, in turn, predisposes to ventricular dysrhythmias, including torsades de pointes and subsequent ventricular fibrillation and death.2 Currently, 12 genetic defects have been identified in LQTS, and hundreds of causative mutations have been described (see Table 2, right).8 Approximately 70% of congenital LQTS cases are caused by mutations in three genes and are classified as LQTS 1, LQTS 2, and LQTS 3.8 The other seven mutation types account for about 5% of cases; a quarter of LQTS cases have no identified genetic mutations.8
LQTS usually can be distinguished by clinical features and some ECG characteristics, but diagnosis of the specific type requires genetic testing.8,9 The most common types of LQTS are discussed below.
- Long QT1 is the most common type, occurring in approximately 40% to 50% of patients diagnosed with LQTS. It is characterized by a defect in the potassium channel alpha subunit leading to IKs reduction.9 These patients typically present with syncope or adrenergic-induced torsades, might have wide, broad-based T-waves on ECG, and respond well to beta-blocker therapy.6 Triggers for these patients include physical exertion or emotional stressors, particularly exercise and swimming. These patients typically present in early childhood.1
- Long QT2 occurs in 35% to 40% of patients and is characterized by a different defect in the alpha subunit of the potassium channel, which leads to reduced IKr.9 ECGs in LQTS2 can demonstrate low-amplitude and notched T-waves. Sudden catecholamine surges related to emotional stress or loud noises and bradycardia can trigger dysrhythmias in Long QT2.6 Thus, beta blockers reduce overall cardiac events in LQTS2 but less effectively than in LQTS1.6 These patients also present in childhood but typically are older than patients with LQTS1.6
- Long QT3 is less common than LQTS1 or LQTS2, at 2% to 8% of LQTS patients, but carries a higher mortality and is not treated effectively with beta blockers. LQTS3 is characterized by a defect in a sodium channel, causing a gain-of-function in the INa.4,9 These patients are more likely to have a fatal dysrhythmia while sleeping, are less susceptible to exercise-induced events, and have increased morbidity and mortality associated with bradycardia.4,9 ECG frequently reveals a relatively long ST segment, followed by a peaked and tall T-wave. Beta-blocker therapy can predispose to dysrhythmias in these patients; therefore, many of these patients will have pacemakers implanted as first-line therapy.6
While less common, Jervell and Lange Nielson syndrome is an autosomal recessive form of LQTS in which affected patients have homozygous mutations in the KCNQ1 or KCNE1 genes. This syndrome occurs in approximately 1% to 7% of LQTS patients, displays a typical QTc >550 ms, can be triggered by exercise and emotional stress, is associated with deafness, and carries a high risk of cardiac events at a young age.6
Acquired Syndromes
In addition to congenital LQTS, certain patients can acquire LQTS after being treated with particular drugs or having metabolic abnormalities, namely hypomagnesemia, hypocalcemia, and hypokalemia. Most experts think patients who “acquire” LQTS that predisposes to torsades de pointes have underlying structural heart disease or LQTS genetic carrier mutations that combine with transient initiating events (e.g., drugs or metabolic abnormalities) to induce dysrhythmias.1 In addition to certain drugs, cardiac ischemia, and electrolyte abnormalities, cocaine abuse, HIV, and subarachnoid hemorrhage can induce dysrhythmias in susceptible patients.5
Many types of drugs can cause a prolonged QT interval, and others should be avoided in patients with pre-existing prolonged QT (see Table 3, p. 17). Potentially offending drugs that are frequently encountered by inpatient physicians include amiodarone, diltiazem, erythromycin, clarithromycin, ciprofloxacin, fluoxetine, paroxetine, sertraline, haloperidol, ritonavir, and methadone.1 Additionally, drugs that cause electrolyte abnormalities (e.g., diuretics and lithium) should be monitored closely.
Overall, the goals of therapy in LQTS are:
- Decrease the risk of dysrhythmic events;
- Minimize adrenergic response;
- Shorten the QTc;
- Decrease the dispersion of refractoriness; and
- Improve the function of the ion channels.3
Supportive measures should be taken for patients who are acutely symptomatic from LQTS and associated torsades de pointes. In addition to immediate cardioversion for ongoing and hemodynamically significant torsades, intravenous magnesium should be administered, electrolytes corrected, and offending drugs discontinued.5 Temporary transvenous pacing at rates of approximately 100 beats per minute is highly effective in preventing short-term recurrence of torsades in congenital and acquired LQTS, especially in bradycardic patients.5 Isoproterenol infusion increases the heart rate and effectively prevents acute recurrence of torsades in patients with acquired LQTS, but it should be used with caution in patients with structural heart disease.5
Long-term strategies to manage LQTS include:
- Minimizing the risk of triggering cardiac events via adrenergic stimulation;
- Preventing ongoing dysrhythmias;
- Avoiding medications known to prolong the QT interval; and
- Maintaining normal electrolytes and minerals.5
Most patients with congenital long QT should be discouraged from participating in competitive sports, and patients should attempt to eliminate exposures to stress or sudden awakening, though this is not practical in all cases.5 Beta blockers generally are the first-line therapy and are more effective for LQT1 than LQT2 or LQT3.4,5 If patients are still symptomatic despite adequate medical therapy, or have survived cardiac arrest, they should be considered for ICD therapy.4,5 In addition, patients with profound bradycardia benefit from pacemaker implantation.5 Patients who remain symptomatic despite both beta blockade and ICD placement might find cervicothoracic sympathectomy curative.4,5
Perioperative Considerations
Although little data is available to guide physicians in the prevention of torsades de pointes during the course of anesthesia, there are a number of considerations that may reduce the chances of symptomatic dysrhythmias.
First, care should be taken to avoid dysrhythmia triggers in LQTS by providing a calm, quiet environment during induction, monitoring, and treating metabolic abnormalities, and providing an appropriate level of anesthesia.10 Beta-blocker therapy should be continued and potentially measured preoperatively by assessing heart rate response during stress testing.5 An implantable cardioverter-defibrillator (AICD) should be interrogated prior to surgery and inactivated during the operation.5
Finally, Kies et al have recommended general anesthesia with propofol for induction (or throughout), isoflurane as the volatile agent, vecuronium for muscle relaxation, and intravenous fentanyl for analgesia when possible.10
Back to the Case
While the patient had no genetic testing for LQTS, evaluation of previous ECGs demonstrated a prolonged QT interval. The hip fracture repair was considered an urgent procedure, which precluded the ability to undertake genetic testing and consideration for device implantation. The only medication that was known to increase the risk for dysrhythmias in this patient was his diuretic, with the attendant risk of electrolyte abnormalities.
Thus, the patient’s hydrochlorothiazide was discontinued and his pre-existing atenolol continued. The patient’s electrolytes and minerals were monitored closely, and magnesium was administered on the day of surgery. Anesthesia was made aware of the prolonged QT interval, such that they were able to minimize the risk for and anticipate the treatment of dysrhythmias. The patient tolerated the surgery and post-operative period without complication and was scheduled for an outpatient workup and management for his prolonged QT interval.
Bottom Line
Long QT syndrome is frequently genetic in origin, but it can be caused by certain medications and perturbations of electrolytes. Beta blockers are the first-line therapy for the majority of LQTS cases, along with discontinuation of drugs that might induce or aggravate the QT prolongation.
Patients who have had cardiac arrest or who remain symptomatic despite beta-blocker therapy should have an ICD implanted.
In the perioperative period, patients’ electrolytes should be monitored and kept within normal limits. If the patient is on a beta blocker, it should be continued, and the anesthesiologist should be made aware of the diagnosis so that the anesthethic plan can be optimized to prevent arrhythmic complications. TH
Dr. Kamali is a medical resident at the University of Colorado Denver. Dr. Stickrath is a hospitalist at the Veterans Affairs Medical Center in Denver and an instructor of medicine at UC Denver. Dr. Prochazka is director of ambulatory care at the Denver VA and professor of medicine at UC Denver. Dr. Varosy is director of cardiac electrophysiology at the Denver VA and assistant professor of medicine at UC Denver.
References
- Kao LW, Furbee BR. Drug-induced q-T prolongation. Med Clin North Am. 2005;89(6):1125-1144.
- Marchlinski F. Chapter 226, The Tachyarrhythmias; Harrison's Principles of Internal Medicine, 17e. Available at: www.accessmedicine.com/resourceTOC .aspx?resourceID=4. Accessed Nov. 21, 2009.
- Zareba W, Cygankiewicz I. Long QT syndrome and short QT syndrome. Prog Cardiovasc Dis. 2008; 51(3):264-278.
- Booker PD, Whyte SD, Ladusans EJ. Long QT syndrome and anaesthesia. Br J Anaesth. 2003;90(3):349-366.
- Khan IA. Long QT syndrome: diagnosis and management. Am Heart J. 2002;143(1):7-14.
- Morita H, Wu J, Zipes DP. The QT syndromes: long and short. Lancet. 2008;372(9640):750-763.
- Schwartz PJ, Moss AJ, Vincent GM, Crampton RS. Diagnostic criteria for the long QT syndrome. An update. Circulation. 1993;88(2):782-784.
- Kapa S, Tester DJ, Salisbury BA, et al. Genetic testing for long-QT syndrome: distinguishing pathogenic mutations from benign variants. Circulation. 2009;120(18):1752-1760.
- Modell SM, Lehmann MH. The long QT syndrome family of cardiac ion channelopathies: a HuGE review. Genet Med. 2006;8(3):143-155.
- Kies SJ, Pabelick CM, Hurley HA, White RD, Ackerman MJ. Anesthesia for patients with congenital long QT syndrome. Anesthesiology. 2005;102(1):204-210.
- Wisely NA, Shipton EA. Long QT syndrome and anaesthesia. Eur J Anaesthesiol. 2002;19(12):853-859.
Case
You are asked to admit a 63-year-old male with a history of hypertension and osteoarthritis. The patient, who fell at home, is scheduled for open repair of his femoral neck fracture the following day. He reports tripping over his granddaughter’s toys and denies any associated symptoms around the time of his fall. An electrocardiogram (ECG) reveals a corrected QT (QTc) interval of 480 ms. How should this hospitalized patient’s prolonged QT interval be managed?
Overview
Patients with a prolonged QT interval on routine ECG present an interesting dilemma for clinicians. Although QT prolongation, whether congenital or acquired, has been associated with dysrhythmias, the risk of torsades de pointes and sudden cardiac death varies considerably based on myriad underlying factors.1 Therefore, the principal job of the clinician who recognizes QT prolongation is to assess and minimize the risk of clinically significant dysrhythmias, and to be prepared to manage them should they arise.
The QT interval encompasses ventricular depolarization and repolarization. The ventricular action potential proceeds through five phases. The initial upstroke of depolarization (phase 0) occurs with the opening of Na+ channels, triggering the inward Na+ current (INa) and causing the interior of the myocyte to become positively charged. This is followed by initial repolarization (phase 1), when the opening of K+ channels produces a transient outward K+ current (Ito). The plateau (phase 2) follows, with a balance between inward current through Ca2+ channels (ICa-L) and outward current through the slowly activating delayed rectifier K+ channels (IKs), joined later by the rapidly activating delayed rectifier channels (IKr). The inward current is then deactivated while outward current increases through the rapid delayed rectifier (IKr) and the opening of inward rectifier channels (IK1), completing repolarization (phase 3). Finally, the action potential returns to baseline (phase 4), and Na+ begins to enter the cell again (see Figure 1, above).
The long QT syndrome (LQTS) is defined by a defect in these cardiac ion channels that leads to abnormal repolarization, usually lengthening the QT interval and thus predisposing to ventricular dysrhythmias.2 It is estimated that as many as 85% of these syndromes are inherited and up to 15% are acquired or sporadic.3 Depending on the underlying etiology of the LQTS, manifestations might first be appreciated at any time from in utero through adulthood.4 Symptoms including palpitations, syncope, seizures, or cardiac arrest bring these patients to medical attention.3 These symptoms frequently are elicited by physical or emotional stress, but they can occur without obvious inciting triggers.5 Untreated symptomatic patients face a mortality risk of 20% in the first year after diagnosis and up to 50% within 10 years.4
How is Long QT Syndrome Diagnosed?
The LQTS diagnosis is based on clinical history in combination with ECG abnormalities.6 Important historical elements include symptoms of palpitations, syncope, seizures, or cardiac arrest.3 In addition, a family history of unexplained syncope or sudden death, especially at a young age, should raise LQTS suspicion.5
A variety of ECG findings can be seen in LQTS patients.4,5 Although the majority of patients have a QTc >440 ms, approximately one-third have a QTc ≤460 ms, and about 10% have normal QTc intervals.5 Other ECG abnormalities include notched, biphasic, or prolonged T-waves and the presence of U-waves.4,5 Schwartz et al used these elements to publish criteria (see Table 1, right) that physicians can use to assess the probability that a patient has LQTS.7
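Because the QTc threshold values above depend on rate correction, it can help to see how a measured QT interval is corrected. The article does not specify a correction formula; the sketch below uses Bazett's formula (QTc = QT/√RR, with RR in seconds), the most widely used correction, purely as an illustration. The function name and sample measurements are hypothetical.

```python
import math

def qtc_bazett(qt_ms: float, rr_ms: float) -> float:
    """Correct a measured QT interval for heart rate using Bazett's formula.

    QTc = QT / sqrt(RR), with QT in milliseconds and RR in seconds.
    """
    rr_s = rr_ms / 1000.0  # Bazett's formula expects RR in seconds
    return qt_ms / math.sqrt(rr_s)

# Hypothetical measurement: QT = 440 ms at a heart rate of 75 bpm
heart_rate = 75
rr_ms = 60_000 / heart_rate      # RR interval in ms (800 ms at 75 bpm)
qtc = qtc_bazett(440, rr_ms)
print(round(qtc))                # ~492 ms, i.e. prolonged (>440 ms)
```

Note that at a heart rate of 60 bpm (RR = 1 s) the correction is the identity, which is a quick sanity check on any implementation.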
Types of Long QT Syndromes
Because the risk of developing significant dysrhythmias with LQTS depends on both the actual QT interval (the risk of sudden cardiac death is increased two to three times with a QT >440 ms compared with a QT <440 ms) and the specific underlying genotype, it is important to understand congenital and acquired LQTS and the associated triggers for torsades de pointes.
Congenital LQTS
Congenital LQTS is caused by mutations in cardiac ion channel proteins, primarily sodium and potassium channels.5,6 These defects either slow depolarization or lengthen repolarization, leading to heterogeneity of repolarization across the membrane.5 This, in turn, predisposes to ventricular dysrhythmias, including torsades de pointes and subsequent ventricular fibrillation and death.2 Currently, 12 genetic defects have been identified in LQTS, and hundreds of causative mutations have been described (see Table 2, right).8 Approximately 70% of congenital LQTS cases are caused by mutations in three genes and are classified as LQTS 1, LQTS 2, and LQTS 3.8 The remaining mutation types account for about 5% of cases; a quarter of LQTS cases have no identified genetic mutation.8
LQTS usually can be distinguished by clinical features and some ECG characteristics, but diagnosis of the specific type requires genetic testing.8,9 The most common types of LQTS are discussed below.
- Long QT1 is the most common type, occurring in approximately 40% to 50% of patients diagnosed with LQTS. It is characterized by a defect in the potassium channel alpha subunit, leading to IKs reduction.9 These patients typically present with syncope or adrenergic-induced torsades, might have broad-based T-waves on ECG, and respond well to beta-blocker therapy.6 Triggers include physical exertion and emotional stressors, particularly exercise and swimming. These patients typically present in early childhood.1
- Long QT2 occurs in 35% to 40% of LQTS patients and is characterized by a different defect in the alpha subunit of the potassium channel, which leads to reduced IKr.9 ECGs in LQTS2 can demonstrate low-amplitude and notched T-waves. Sudden catecholamine surges related to emotional stress or loud noises, as well as bradycardia, can trigger dysrhythmias in Long QT2.6 Beta blockers reduce overall cardiac events in LQTS2, but less effectively than in LQTS1.6 These patients also present in childhood but typically are older than patients with LQTS1.6
- Long QT3 is less common than LQTS1 or LQTS2, at 2% to 8% of LQTS patients, but carries a higher mortality and is not treated effectively with beta blockers. LQTS3 is characterized by a defect in a sodium channel, causing a gain-of-function in the INa.4,9 These patients are more likely to have a fatal dysrhythmia while sleeping, are less susceptible to exercise-induced events, and have increased morbidity and mortality associated with bradycardia.4,9 ECG frequently reveals a relatively long ST segment, followed by a peaked and tall T-wave. Beta-blocker therapy can predispose to dysrhythmias in these patients; therefore, many of these patients will have pacemakers implanted as first-line therapy.6
While less common, Jervell and Lange-Nielsen syndrome is an autosomal recessive form of LQTS in which affected patients have homozygous mutations in the KCNQ1 or KCNE1 genes. This syndrome occurs in approximately 1% to 7% of LQTS patients, displays a typical QTc >550 ms, can be triggered by exercise and emotional stress, is associated with deafness, and carries a high risk of cardiac events at a young age.6
Acquired Syndromes
In addition to congenital LQTS, certain patients can acquire LQTS after being treated with particular drugs or having metabolic abnormalities, namely hypomagnesemia, hypocalcemia, and hypokalemia. Most experts think patients who “acquire” LQTS that predisposes to torsades de pointes have underlying structural heart disease or LQTS genetic carrier mutations that combine with transient initiating events (e.g., drugs or metabolic abnormalities) to induce dysrhythmias.1 In addition to certain drugs, cardiac ischemia, and electrolyte abnormalities, cocaine abuse, HIV, and subarachnoid hemorrhage can induce dysrhythmias in susceptible patients.5
Many types of drugs can cause a prolonged QT interval, and others should be avoided in patients with pre-existing prolonged QT (see Table 3, p. 17). Potentially offending drugs that are frequently encountered by inpatient physicians include amiodarone, diltiazem, erythromycin, clarithromycin, ciprofloxacin, fluoxetine, paroxetine, sertraline, haloperidol, ritonavir, and methadone.1 Additionally, drugs that cause electrolyte abnormalities (e.g., diuretics and lithium) should be monitored closely.
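As a minimal illustration of how a medication-reconciliation step might operationalize this paragraph, the sketch below screens an admission medication list against the drugs named above. The sets, the function name, and the inclusion of furosemide as a second electrolyte-disturbing diuretic are assumptions for illustration only; this is not an exhaustive or authoritative drug list.

```python
# Illustrative screening sets built from the drugs named in this article
# (see Table 3 in the original); NOT exhaustive or authoritative.
QT_PROLONGING = {
    "amiodarone", "diltiazem", "erythromycin", "clarithromycin",
    "ciprofloxacin", "fluoxetine", "paroxetine", "sertraline",
    "haloperidol", "ritonavir", "methadone",
}
# Drugs that can cause electrolyte abnormalities; furosemide is an assumed
# example of a diuretic alongside the article's hydrochlorothiazide.
ELECTROLYTE_DISTURBING = {"hydrochlorothiazide", "furosemide", "lithium"}

def flag_medications(med_list):
    """Return meds warranting review in a patient with a prolonged QT."""
    meds = {m.lower() for m in med_list}
    return {
        "qt_prolonging": sorted(meds & QT_PROLONGING),
        "monitor_electrolytes": sorted(meds & ELECTROLYTE_DISTURBING),
    }

# The patient in this article's case takes hydrochlorothiazide and atenolol
flags = flag_medications(["Hydrochlorothiazide", "Atenolol"])
print(flags)  # hydrochlorothiazide is flagged for electrolyte monitoring
```

In practice such a check would draw on a maintained drug-interaction database rather than a hard-coded set, but the set-intersection pattern is the same.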
Overall, the goals of therapy in LQTS are:
- Decrease the risk of dysrhythmic events;
- Minimize adrenergic response;
- Shorten the QTc;
- Decrease the dispersion of refractoriness; and
- Improve the function of the ion channels.3
Supportive measures should be taken for patients who are acutely symptomatic from LQTS and associated torsades de pointes. In addition to immediate cardioversion for ongoing and hemodynamically significant torsades, intravenous magnesium should be administered, electrolytes corrected, and offending drugs discontinued.5 Temporary transvenous pacing at rates of approximately 100 beats per minute is highly effective in preventing short-term recurrence of torsades in congenital and acquired LQTS, especially in bradycardic patients.5 Isoproterenol infusion increases the heart rate and effectively prevents acute recurrence of torsades in patients with acquired LQTS, but it should be used with caution in patients with structural heart disease.5
Long-term strategies to manage LQTS include:
- Minimizing the risk of triggering cardiac events via adrenergic stimulation;
- Preventing ongoing dysrhythmias;
- Avoiding medications known to prolong the QT interval; and
- Maintaining normal electrolytes and minerals.5
Most patients with congenital long QT should be discouraged from participating in competitive sports, and patients should attempt to eliminate exposures to stress or sudden awakening, though this is not practical in all cases.5 Beta blockers generally are the first-line therapy and are more effective for LQT1 than LQT2 or LQT3.4,5 If patients are still symptomatic despite adequate medical therapy, or have survived cardiac arrest, they should be considered for ICD therapy.4,5 In addition, patients with profound bradycardia benefit from pacemaker implantation.5 Patients who remain symptomatic despite both beta blockade and ICD placement might find cervicothoracic sympathectomy curative.4,5
Perioperative Considerations
Although few data are available to guide physicians in the prevention of torsades de pointes during anesthesia, a number of considerations may reduce the chance of symptomatic dysrhythmias.
First, care should be taken to avoid dysrhythmia triggers in LQTS by providing a calm, quiet environment during induction; monitoring for and treating metabolic abnormalities; and providing an appropriate depth of anesthesia.10 Beta-blocker therapy should be continued, and its adequacy can be assessed preoperatively by measuring the heart rate response during stress testing.5 An implantable cardioverter-defibrillator (ICD) should be interrogated prior to surgery and inactivated during the operation.5
Finally, Kies et al have recommended general anesthesia with propofol for induction (or throughout), isoflurane as the volatile agent, vecuronium for muscle relaxation, and intravenous fentanyl for analgesia when possible.10
Back to the Case
Although the patient had not undergone genetic testing for LQTS, review of previous ECGs demonstrated a prolonged QT interval. The hip fracture repair was considered an urgent procedure, which precluded genetic testing and consideration of device implantation before surgery. The only medication known to increase this patient’s risk for dysrhythmias was his diuretic, with its attendant risk of electrolyte abnormalities.
Thus, the patient’s hydrochlorothiazide was discontinued and his pre-existing atenolol continued. His electrolytes and minerals were monitored closely, and magnesium was administered on the day of surgery. The anesthesia team was made aware of the prolonged QT interval so that they could minimize the risk of dysrhythmias and anticipate their treatment. The patient tolerated the surgery and postoperative period without complication and was scheduled for outpatient workup and management of his prolonged QT interval.
Bottom Line
Long QT syndrome is frequently genetic in origin, but it can be caused by certain medications and perturbations of electrolytes. Beta blockers are the first-line therapy for the majority of LQTS cases, along with discontinuation of drugs that might induce or aggravate the QT prolongation.
Patients who have had cardiac arrest or who remain symptomatic despite beta-blocker therapy should have an ICD implanted.
In the perioperative period, patients’ electrolytes should be monitored and kept within normal limits. If the patient is on a beta blocker, it should be continued, and the anesthesiologist should be made aware of the diagnosis so that the anesthetic plan can be optimized to prevent arrhythmic complications. TH
Dr. Kamali is a medical resident at the University of Colorado Denver. Dr. Stickrath is a hospitalist at the Veterans Affairs Medical Center in Denver and an instructor of medicine at UC Denver. Dr. Prochazka is director of ambulatory care at the Denver VA and professor of medicine at UC Denver. Dr. Varosy is director of cardiac electrophysiology at the Denver VA and assistant professor of medicine at UC Denver.
References
- Kao LW, Furbee BR. Drug-induced Q-T prolongation. Med Clin North Am. 2005;89(6):1125-1144.
- Marchlinski F. The tachyarrhythmias. In: Harrison's Principles of Internal Medicine. 17th ed. Available at: www.accessmedicine.com/resourceTOC.aspx?resourceID=4. Accessed Nov. 21, 2009.
- Zareba W, Cygankiewicz I. Long QT syndrome and short QT syndrome. Prog Cardiovasc Dis. 2008; 51(3):264-278.
- Booker PD, Whyte SD, Ladusans EJ. Long QT syndrome and anaesthesia. Br J Anaesth. 2003;90(3):349-366.
- Khan IA. Long QT syndrome: diagnosis and management. Am Heart J. 2002;143(1):7-14.
- Morita H, Wu J, Zipes DP. The QT syndromes: long and short. Lancet. 2008;372(9640):750-763.
- Schwartz PJ, Moss AJ, Vincent GM, Crampton RS. Diagnostic criteria for the long QT syndrome. An update. Circulation. 1993;88(2):782-784.
- Kapa S, Tester DJ, Salisbury BA, et al. Genetic testing for long-QT syndrome: distinguishing pathogenic mutations from benign variants. Circulation. 2009;120(18):1752-1760.
- Modell SM, Lehmann MH. The long QT syndrome family of cardiac ion channelopathies: a HuGE review. Genet Med. 2006;8(3):143-155.
- Kies SJ, Pabelick CM, Hurley HA, White RD, Ackerman MJ. Anesthesia for patients with congenital long QT syndrome. Anesthesiology. 2005;102(1):204-210.
- Wisely NA, Shipton EA. Long QT syndrome and anaesthesia. Eur J Anaesthesiol. 2002;19(12):853-859.
Case
You are asked to admit a 63-year-old male with a history of hypertension and osteoarthritis. The patient, who fell at home, is scheduled for open repair of his femoral neck fracture the following day. The patient reports tripping over his granddaughter’s toys and denies any associated symptoms around the time of his fall. An electrocardiogram (ECG) reveals a QTc (QT) interval of 480 ms. How should this hospitalized patient’s prolonged QT interval be managed?
Overview
Patients with a prolonged QT interval on routine ECG present an interesting dilemma for clinicians. Although QT prolongation—either congenital or acquired—has been associated with dysrhythmias, the risk of torsades de pointes and sudden cardiac death varies considerably based on myriad underlying factors.1 Therefore, the principle job of the clinician who has recognized QT prolongation is to assess and minimize the risk of the development of clinically significant dysrhythmias, and to be prepared to manage them should they arise.
The QT interval encompasses ventricular depolarization and repolarization. This ventricular action potential proceeds through five phases. The initial upstroke (phase 0) of depolarization occurs with the opening of Na+ channels, triggering the inward Na+ current (INa), and causes the interior of the myocytes to become positively charged. This is followed by initial repolarization (phase 1) when the opening of K+ channels causes an outward K+ current (Ito). Next, the plateau phase (phase 2) of the action potential follows with a balance of inward current through Ca2+channels (Ica-L) and outward current through slow rectifier K+ channels (IKs), and then later through delayed, rapid K+ rectifier channels (IKr). Then, the inward current is deactivated, while the outward current increases through the rapid delayed rectifier (IKr) and opening of inward rectifier channels (IK1) to complete repolarization (phase 3). Finally, the action potential returns to baseline (phase 4) and Na+ begins to enter the cell again (see Figure 1, above).
The long QT syndrome (LQTS) is defined by a defect in these cardiac ion channels, which leads to abnormal repolarization, usually lengthening the QT interval and thus predisposing to ventricular dysrhythmias.2 It is estimated that as many as 85% of these syndromes are inherited, and up to 15% are acquired or sporadic.3 Depending on the underlying etiology of the LQTS, manifestations might first be appreciated at any time from in utero through adulthood.4 Symptoms including palpitations, syncope, seizures, or cardiac arrest bring these patients to medical attention.3 These symptoms frequently elicit physical or emotional stress, but they can occur without obvious inciting triggers.5 A 20% mortality risk exists in patients who are symptomatic and untreated in the first year following diagnosis, and up to 50% within 10 years following diagnosis.4
How is Long QT Syndrome Diagnosed?
The LQTS diagnosis is based on clinical history in combination with ECG abnormalities.6 Important historical elements include symptoms of palpitations, syncope, seizures, or cardiac arrest.3 In addition, a family history of unexplained syncope or sudden death, especially at a young age, should raise LQTS suspicion.5
A variety of ECG findings can be witnessed in LQTS patients.4,5 Although the majority of patients have a QTc >440 ms, approximately one-third have a QTc ≤460 ms, and about 10% have normal QTc intervals.5 Other ECG abnormalities include notched, biphasic, or prolonged T-waves, and the presence of U-waves.4,5 Schwartz et al used these elements to publish criteria (see Table 1, right) that physicians can use to assess the probability that a patient has LQTS.7
Types of Long QT Syndromes
Because the risk of developing significant dysrhythmias with LQTS is dependent on both the actual QT interval, with risk for sudden cardiac death increased two to three times with QT >440 ms compared with QT <440 ms and the specific underlying genotype, it is important to have an understanding of congenital and acquired LQTS and the associated triggers for torsades de pointes.
Congenital LQTS
Congenital LQTS is caused by mutations in cardiac ion channel proteins, primarily sodium, and potassium channels.5,6 These defects either slow depolarization or lengthen repolarization, leading to heterogeneity of repolarization of the membrane.5 This, in turn, predisposes to ventricular dysrhythmias, including torsades de pointes and subsequent ventricular fibrillation and death.2 Currently, 12 genetic defects have been identified in LQTS. Hundreds of mutations have been described to cause these defects (see Table 2, right).8 Approximately 70% of congenital LQTS are caused by mutations in three genes and are classified as LQTS 1, LQTS 2, and LQTS 3.8 The other seven mutation types account for about 5% of cases; a quarter of LQTS cases have no identified genetic mutations.8
LQTS usually can be distinguished by clinical features and some ECG characteristics, but diagnosis of the specific type requires genetic testing.8,9 The most common types of LQTS are discussed below.
- Long QT1 is the most common type, occurring in approximately 40% to 50% of patients diagnosed with LQTS. It is characterized by a defect in the potassium channel alpha subunit leading to IKs reduction.9 These patients typically present with syncope or adrenergic-induced torsades, might have wide, broad-based T-waves on ECG, and respond well to beta-blocker therapy.6 Triggers for these patients include physical exertion or emotional stressors, particularly exercise and swimming. These patients typically present in early childhood.1
- Long QT2 occurs in 35% to 40% of patients and is characterized by a different defect in the alpha subunit of the potassium channel, which leads to reduced IKr.9 ECGs in LQTS2 can demonstrate low-amplitude and notched T-waves. Sudden catecholamine surges related to emotional stress or loud noises and bradycardia can trigger dysrhythmias in Long QT2.6 Thus, beta blockers reduce overall cardiac events in LQTS2 but less effectively than in LQTS1.6 These patients also present in childhood but typically are older than patients with LQTS1.6
- Long QT3 is less common than LQTS1 or LQTS2, at 2% to 8% of LQTS patients, but carries a higher mortality and is not treated effectively with beta blockers. LQTS3 is characterized by a defect in a sodium channel, causing a gain-of-function in the INa.4,9 These patients are more likely to have a fatal dysrhythmia while sleeping, are less susceptible to exercise-induced events, and have increased morbidity and mortality associated with bradycardia.4,9 ECG frequently reveals a relatively long ST segment, followed by a peaked and tall T-wave. Beta-blocker therapy can predispose to dysrhythmias in these patients; therefore, many of these patients will have pacemakers implanted as first-line therapy.6
While less common, Jervell and Lange Nielson syndrome is an autosomal recessive form of LQTS in which affected patients have homozygous mutations in the KCNQ1 or KCNE1 genes. This syndrome occurs in approximately 1% to 7% of LQTS patients, displays a typical QTc >550 ms, can be triggered by exercise and emotional stress, is associated with deafness, and carries a high risk of cardiac events at a young age.6
Acquired Syndromes
In addition to congenital LQTS, certain patients can acquire LQTS after being treated with particular drugs or having metabolic abnormalities, namely hypomagnesemia, hypocalcemia, and hypokalemia. Most experts think patients who “acquire” LQTS that predisposes to torsades de pointes have underlying structural heart disease or LQTS genetic carrier mutations that combine with transient initiating events (e.g., drugs or metabolic abnormalities) to induce dysrhythmias.1 In addition to certain drugs, cardiac ischemia, and electrolyte abnormalities, cocaine abuse, HIV, and subarachnoid hemorrhage can induce dysrhythmias in susceptible patients.5
Many types of drugs can cause a prolonged QT interval, and others should be avoided in patients with pre-existing prolonged QT (see Table 3, p. 17). Potentially offending drugs that are frequently encountered by inpatient physicians include amiodarone, diltiazem, erythromycin, clarithromycin, ciprofloxacin, fluoxetine, paroxetine, sertraline, haloperidol, ritonavir, and methadone.1 Additionally, drugs that cause electrolyte abnormalities (e.g., diuretics and lithium) should be monitored closely.
Overall, the goals of therapy in LQTS are:
- Decrease the risk of dysrhythmic events;
- Minimize adrenergic response;
- Shorten the QTc;
- Decrease the dispersion of refractoriness; and
- Improve the function of the ion channels.3
Supportive measures should be taken for patients who are acutely symptomatic from LQTS and associated torsades de pointes. In addition to immediate cardioversion for ongoing and hemodynamically significant torsades, intravenous magnesium should be administered, electrolytes corrected, and offending drugs discontinued.5 Temporary transvenous pacing at rates of approximately 100 beats per minute is highly effective in preventing short-term recurrence of torsades in congenital and acquired LQTS, especially in bradycardic patients.5 Isoproterenol infusion increases the heart rate and effectively prevents acute recurrence of torsades in patients with acquired LQTS, but it should be used with caution in patients with structural heart disease.5
Long-term strategies to manage LQTS include:
- Minimizing the risk of triggering cardiac events via adrenergic stimulation;
- Preventing ongoing dysrhythmias;
- Avoiding medications known to prolong the QT interval; and
- Maintaining normal electrolytes and minerals.5
Most patients with congenital LQTS should be discouraged from participating in competitive sports, and patients should attempt to avoid triggers such as emotional stress and sudden awakening, though this is not practical in all cases.5 Beta blockers generally are first-line therapy and are more effective for LQT1 than for LQT2 or LQT3.4,5 Patients who remain symptomatic despite adequate medical therapy, or who have survived cardiac arrest, should be considered for ICD therapy.4,5 In addition, patients with profound bradycardia benefit from pacemaker implantation.5 Patients who remain symptomatic despite both beta blockade and ICD placement may find cervicothoracic sympathectomy curative.4,5
Perioperative Considerations
Although few data are available to guide physicians in preventing torsades de pointes during anesthesia, a number of considerations may reduce the chance of symptomatic dysrhythmias.
First, care should be taken to avoid dysrhythmia triggers in LQTS by providing a calm, quiet environment during induction; monitoring for and treating metabolic abnormalities; and maintaining an appropriate depth of anesthesia.10 Beta-blocker therapy should be continued, and its adequacy can be assessed preoperatively by the heart rate response during stress testing.5 An implantable cardioverter-defibrillator (ICD) should be interrogated prior to surgery and inactivated during the operation.5
Finally, Kies et al have recommended general anesthesia with propofol for induction (or throughout), isoflurane as the volatile agent, vecuronium for muscle relaxation, and intravenous fentanyl for analgesia when possible.10
Back to the Case
Although the patient had not undergone genetic testing for LQTS, review of previous ECGs demonstrated a prolonged QT interval. The hip fracture repair was considered an urgent procedure, which precluded genetic testing and consideration of device implantation before surgery. The only medication known to increase this patient’s risk for dysrhythmias was his diuretic, with its attendant risk of electrolyte abnormalities.
Thus, the patient’s hydrochlorothiazide was discontinued and his pre-existing atenolol continued. The patient’s electrolytes and minerals were monitored closely, and magnesium was administered on the day of surgery. The anesthesia team was made aware of the prolonged QT interval so that they could minimize the risk of dysrhythmias and anticipate their treatment. The patient tolerated the surgery and postoperative period without complication and was scheduled for outpatient workup and management of his prolonged QT interval.
Bottom Line
Long QT syndrome is frequently genetic in origin, but it can be caused by certain medications and perturbations of electrolytes. Beta blockers are the first-line therapy for the majority of LQTS cases, along with discontinuation of drugs that might induce or aggravate the QT prolongation.
Patients who have had cardiac arrest or who remain symptomatic despite beta-blocker therapy should have an ICD implanted.
In the perioperative period, patients’ electrolytes should be monitored and kept within normal limits. If the patient is on a beta blocker, it should be continued, and the anesthesiologist should be made aware of the diagnosis so that the anesthetic plan can be optimized to prevent arrhythmic complications. TH
Dr. Kamali is a medical resident at the University of Colorado Denver. Dr. Stickrath is a hospitalist at the Veterans Affairs Medical Center in Denver and an instructor of medicine at UC Denver. Dr. Prochazka is director of ambulatory care at the Denver VA and professor of medicine at UC Denver. Dr. Varosy is director of cardiac electrophysiology at the Denver VA and assistant professor of medicine at UC Denver.
References
- Kao LW, Furbee BR. Drug-induced q-T prolongation. Med Clin North Am. 2005;89(6):1125-1144.
- Marchlinski F. Chapter 226, The Tachyarrhythmias. Harrison's Principles of Internal Medicine, 17e. Available at: www.accessmedicine.com/resourceTOC.aspx?resourceID=4. Accessed Nov. 21, 2009.
- Zareba W, Cygankiewicz I. Long QT syndrome and short QT syndrome. Prog Cardiovasc Dis. 2008;51(3):264-278.
- Booker PD, Whyte SD, Ladusans EJ. Long QT syndrome and anaesthesia. Br J Anaesth. 2003;90(3):349-366.
- Khan IA. Long QT syndrome: diagnosis and management. Am Heart J. 2002;143(1):7-14.
- Morita H, Wu J, Zipes DP. The QT syndromes: long and short. Lancet. 2008;372(9640):750-763.
- Schwartz PJ, Moss AJ, Vincent GM, Crampton RS. Diagnostic criteria for the long QT syndrome. An update. Circulation. 1993;88(2):782-784.
- Kapa S, Tester DJ, Salisbury BA, et al. Genetic testing for long-QT syndrome: distinguishing pathogenic mutations from benign variants. Circulation. 2009;120(18):1752-1760.
- Modell SM, Lehmann MH. The long QT syndrome family of cardiac ion channelopathies: a HuGE review. Genet Med. 2006;8(3):143-155.
- Kies SJ, Pabelick CM, Hurley HA, White RD, Ackerman MJ. Anesthesia for patients with congenital long QT syndrome. Anesthesiology. 2005;102(1):204-210.
- Wisely NA, Shipton EA. Long QT syndrome and anaesthesia. Eur J Anaesthesiol. 2002;19(12):853-859.