Kidney transplant: New opportunities and challenges
Much has improved in renal transplantation over the past 20 years. The focus has shifted to using stronger immunosuppressive therapy rather than trying to minimize it. There has been increasing recognition of infection and ways to prevent and treat it. Induction therapy now has greater emphasis so that maintenance therapy can be eased, with the aim of reducing long-term toxicity. Perhaps the biggest change is the practice of screening for donor-specific antibodies at the time of transplant so that predictable problems can be prevented or better handled if they occur. Such advances have helped patients both directly and by extending the life of their transplanted organs.
LONGER SURVIVAL
As early as the 1990s, it was recognized that kidney transplant offers a survival advantage for patients with end-stage renal disease over maintenance on dialysis.1 Although the risk of death is higher immediately after transplant, within a few months it becomes much lower than for patients on dialysis. Survival varies according to the health of the patient and the quality of the transplanted organ.
In general, patients who obtain the greatest benefit from transplants in terms of years of life gained are those with diabetes, especially those who are younger. Those ages 20 to 39 live about 8 years on dialysis vs 25 years after transplant.
CONTRAINDICATIONS TO TRANSPLANT
There are multiple contraindications to a solitary kidney transplant (Table 1), including smoking. Most transplant centers require that smokers quit before transplant. Long-standing smokers almost double their risk of a cardiac event after transplant and double their rate of malignancy. Active smoking at the time of transplant is associated with twice the risk of death by 10 years after transplant compared with that of nonsmokers.2 Cotinine testing can detect whether a patient is an active smoker.
WAITING-LIST CONSIDERATIONS
Organs are scarce
The number of patients on the kidney waiting list has increased rapidly in the last few decades, while the number of transplants performed each year has remained about the same. In 2016, about 100,000 patients were on the list, but only about 19,000 transplants were performed.3 Wait times, especially for deceased-donor organs, have increased to about 6 years, varying by blood type and geographic region.
Waiting-list placement
Placement on the waiting list for a deceased-donor kidney transplant occurs when a patient has an estimated glomerular filtration rate (GFR) of 20 mL/min/1.73 m2 or less, although referral to the list can be made earlier. Early listing remains advantageous, as time accrued on the list before starting dialysis still counts toward the wait. “Preemptive transplant” means the patient had no dialysis before transplant; this applies to about 10% of transplant recipients. These patients tend to fare the best and are usually recipients of a living-donor organ.
Most patients do not receive a transplant until the GFR is less than 15 mL/min/1.73 m2.
Since 2014, wait time has been measured from the beginning of dialysis rather than the date of waiting-list placement in patients who are listed after starting dialysis therapy. This approach is fairer but can introduce problems. A patient who did not previously know about the list may suddenly jump to the head of the line after 10 years of dialysis, by which time comorbidities associated with long-term dialysis make the patient less likely to gain as much benefit from a transplant as people lower on the list. Time on dialysis, or “dialysis vintage,” predicts patient and kidney survival after transplant, with reduced survival associated with increasing time on dialysis.4
Shorter wait for a suboptimal kidney
The aging population has increased the number of older patients being listed for transplant, presenting multiple challenges. Patients age 65 or older have a 50% chance of dying before they receive a transplant during a 5-year wait.
A patient may shorten the wait by joining the list for a suboptimal organ. All deceased-donor organs are given a Kidney Donor Profile Index score, which predicts the longevity of an organ after transplant. The score is determined by donor age, kidney function based on the serum creatinine at the time of death, and other donor factors.
A kidney with a score higher than 85% is likely to function longer than only 15% of available kidneys. Patients who receive a kidney with that score have a longer period of elevated risk of death soon after transplant and a slightly higher risk of death in the long term than patients who receive a healthier kidney, although on average they still do better than patients on dialysis.5
Older patients should be encouraged to sign up for both the regular waiting list and the suboptimal kidney waiting list to reduce the risk of dying before they get a kidney.
LIVING-DONOR ORGAN TRANSPLANT
Many advantages
Living-donor organ transplant is associated with a better survival rate than deceased-donor organ transplant, and the advantage becomes greater over time. At 1 year, patient survival is more than 90% in both groups, but by 5 years about 80% of patients with a living-donor organ are still alive vs only about 65% of patients with a deceased-donor organ.
The waiting time for a living-donor transplant may be only weeks to months, rather than years. Because increasing time on dialysis predicts worse patient and graft survival after transplant, the shorter wait time is a big advantage. In addition, because the donor and recipient are typically in adjacent operating rooms, the organ sustains less ischemic damage. In general, the kidney quality is better from healthy donors, resulting in superior function early on and longer graft survival by an average of 4 years. If the living donor is related to the recipient, human leukocyte antigen matching also tends to be better and predicts better outcomes.
Special challenges
Opting for a living-donor organ also entails special challenges. In addition to the ethical issues surrounding living-donor organ donation, an appropriate donor must be found. Donors must be highly motivated and pass physical, laboratory, and psychological evaluations.
For older patients, if the donor is a spouse or close friend, he or she is also likely to be older, making the organ less viable than one from a younger person. Even an adult child may not be an ideal donor if there is a family propensity to kidney disease, such as diabetic nephropathy. No test is available to determine the risk for future diabetes, but it is known to run in families.
POTENT IMMUNOSUPPRESSION
Induction therapy
Induction therapy with antithymocyte globulin or basiliximab provides intense immunosuppression to prevent acute rejection during the early posttransplant period.
Antithymocyte globulin is a potent agent that contains antibodies directed at T cells, B cells, neutrophils, platelets, adhesion molecules, and complement. It binds T cells and removes them from circulation by opsonization in splenic and lymphoid tissue. The immunosuppressive effect is sustained for at least 2 to 3 months after a series of injections (dosage 1.5 mg/kg/day, usually for 4 to 10 doses). Antithymocyte globulin is also used to treat acute rejection, especially high-grade rejection for which steroid therapy is likely to be insufficient.
Basiliximab consists of antibodies to the interleukin 2 (IL-2) receptor of T cells. Binding to T cells prevents their activation rather than removing them from circulation. The drug prevents rejection, with 30% relative reduction in early studies compared with placebo. However, it is ineffective in reversing established rejection. Dosage is 20 mg at day 0 and day 4, which provides receptor saturation for 30 to 45 days.
Basiliximab is also sometimes used off-label for patients who need to discontinue a calcineurin inhibitor (ie, tacrolimus or cyclosporine). In such cases, maintenance therapy is put on hold while basiliximab is given for 1 or 2 doses. Case series have been reported for this use, particularly for patients with a heart or liver transplant who develop acute kidney injury while hospitalized.6,7
Antithymocyte globulin is more effective but also more risky. Brennan et al8 randomized 278 transplant recipients to either antithymocyte globulin or basiliximab. Patients in the antithymocyte globulin group had a 16% rejection rate vs 26% in the basiliximab group.
Antithymocyte globulin therapy is associated with multiple adverse effects, including fever and chills, pulmonary edema, and long-standing immunosuppressive effects such as increased risk of lymphoma and cytomegalovirus (CMV) infection. Basiliximab side-effect profiles are similar to those of placebo.
Maintenance therapy
The calcineurin inhibitors cyclosporine and tacrolimus remain the standard of care in kidney transplant despite multiple drug interactions and side effects that include renal toxicity and fibrosis. Cyclosporine and tacrolimus both bind intracellular immunophilins and thereby prevent transcription of IL-2 and activation of T cells. The drugs work similarly but have different binding sites. Cyclosporine has largely been replaced by tacrolimus, whose more reliable dosing and higher potency are associated with lower rejection rates.
Tacrolimus is typically given twice daily (1–6 mg/dose). Twelve-hour trough levels are followed (target: 8–12 ng/mL early on, then 5–8 ng/mL after 3 months posttransplant). Side effects include hypertension and hypercholesterolemia, but less so than with cyclosporine. On the other hand, hyperglycemia tends to be worse with tacrolimus than with cyclosporine, and combining tacrolimus with steroids frequently leads to diabetes. Tacrolimus can also cause acute and chronic renal failure, especially at high drug levels, as well as neurotoxicity, tremors, and hair loss.
Cyclosporine, tacrolimus, and sirolimus (not a calcineurin inhibitor) are metabolized through the same cytochrome P450 pathway (CYP3A4), so they have common drug interactions (Table 2).
Mycophenolate mofetil is typically used as an adjunct therapy (500–1,000 mg twice daily). It is also used for other kidney diseases before transplant, including lupus nephritis. Rejection rates with mycophenolate mofetil plus steroids alone are about 40%, so the drug is not potent enough to be used without a calcineurin inhibitor.
Side effects include gastrointestinal toxicity in up to 20% of patients, and leukopenia, which is associated with viral infections.
CORONARY ARTERY DISEASE IS COMMON WITH DIALYSIS
Coronary artery disease is highly associated with end-stage kidney disease and occurs in as many as 85% of older patients with diabetes on dialysis. Although patients with end-stage kidney disease tend to have more numerous and severe atherosclerotic lesions compared with the general population, justifying aggressive management, cardiac care tends to be conservative in patients on dialysis.9
Death from acute myocardial infarction occurs in about 20% to 30% of patients on dialysis vs about 2% of patients with normal renal function. Five years after myocardial infarction, survival is only about 30% in patients on dialysis.9
There are many explanations for excess coronary artery disease in patients on dialysis. In addition to the traditional cardiovascular risk factors of diabetes, hypertension, and preexisting coronary artery disease, patients are in a proinflammatory uremic state and have high levels of phosphorus and fibroblast growth factor 23 that contribute to vascular calcification. Almost all patients have high homocysteine levels and hemodynamic instability, particularly if they are on hemodialysis.
Pretransplant evaluation for heart disease
Patients on the kidney transplant waiting list are screened aggressively for heart disease. A history of myocardial infarction usually results in removal from the list. All patients have an initial electrocardiogram and echocardiogram. Thallium or echocardiographic stress testing is used for patients who are age 50 and older, have diabetes, or have had dialysis for many years. Patients with evidence of ischemia undergo catheterization.
Patients are also screened with computed tomography before transplant. Because the kidney is typically anastomosed to the iliac artery and vein, heavy calcification of the iliac artery can make the procedure too difficult to perform.
Reduced long-term risk of myocardial infarction after transplant
Kasiske et al10 analyzed data from more than 50,000 patients from the US Renal Data System and found that, for about the first year after transplant, patients who underwent kidney transplant were more likely to have a myocardial infarction than those on dialysis. After that, they fared better than patients who remained on dialysis. Those with a living-donor transplant were less likely at all times to have a myocardial infarction than those with a deceased-donor transplant. By 3 years after transplant, the relative risk of having a myocardial infarction was 0.89 for deceased-donor organ recipients and 0.69 for living-donor recipients compared with patients on the waiting list.10
INFECTIOUS COMPLICATIONS IN KIDNEY RECIPIENTS
Kidney recipients are prone to many common and uncommon infections (Table 3). All potential recipients are tested pretransplant for hepatitis B, hepatitis C, human immunodeficiency virus, syphilis, and tuberculosis. A positive result does not necessarily rule out transplant.
The following viral serology tests are also done before transplant:
Epstein-Barr virus (antibodies are positive in about 90% of adults)
CMV (about 70% of adults are seropositive)
Varicella zoster (seronegative patients should be given live-attenuated varicella vaccine).
Risk of transmission of these viruses relates to the serostatus of the donor and recipient before transplant. If a donor is positive for viral antibodies but the recipient is not (a so-called “mismatch”), risk is higher after transplant.
Hepatitis C
Patients with hepatitis C fare better if they get a transplant than if they remain on dialysis, although their posttransplant course is worse compared with transplant patients who do not have hepatitis. Some patients develop accelerated liver disease after kidney transplant. Hepatitis C-related kidney disease—membranoproliferative glomerulonephritis—also occurs, as do comorbidities such as diabetes.
Careful evaluation is warranted before transplant, including liver imaging, alpha-fetoprotein testing, and liver biopsy to evaluate for hepatocellular carcinoma. A patient with advanced fibrosis or cirrhosis may not be a candidate for kidney transplant alone but could possibly receive a combined kidney and liver transplant.
The best time to treat hepatitis C infection has yet to be determined. Patients with advanced liver disease or hepatitis C-related kidney disease would likely benefit from early treatment. However, delaying treatment could shorten the wait time for a deceased-donor organ: transplant candidates with active hepatitis C are uniquely positioned to accept hepatitis C-positive kidneys, which are often discarded, and may wait only weeks for such a transplant. The shortened kidney survival historically associated with a hepatitis C-positive kidney may no longer apply with the new antiviral hepatitis C therapies, which have been shown to be effective posttransplant.
Hepatitis B
No cure is available for hepatitis B infection, but it can be well controlled with antiviral therapy. Patients with hepatitis B infection may be candidates for transplant, but they should be stable on antiviral therapy (lamivudine, entecavir, or tenofovir) to suppress the viral load before transplant, and therapy should be continued afterward. Liver imaging, alpha-fetoprotein levels, and biopsy are recommended for evaluation. All hepatitis B-negative patients should be vaccinated before transplant.
Organs from living or deceased donors that test positive for hepatitis B core antibody, indicating prior exposure, can be considered for transplant in a patient who tests positive for hepatitis B surface antibody, indicating successful vaccination or prior exposure in the recipient. But donors must have negative surface antigen and polymerase chain reaction (PCR) tests that indicate no active hepatitis B infection.
Cytomegalovirus
CMV typically does not appear until prophylactic therapy is stopped. Classic symptoms are fever, leukopenia, and diarrhea. Infection can involve any organ, and patients may present with hepatitis, pancreatitis or, less commonly, pneumonitis.
Patients who are negative for CMV before transplant and receive a donor-positive organ are at the highest risk. Patients who are CMV IgG-positive are considered to be at intermediate risk, regardless of the donor status. Patients who are negative for CMV and receive a donor-negative organ are at the lowest risk and do not need prophylaxis with valganciclovir.
CMV infection is diagnosed by PCR testing of the blood or immunostaining in tissue biopsy. Occasionally, blood testing is negative in the face of tissue-based disease.
BK virus
BK virus is a polyomavirus commonly associated with kidney transplant. Viremia is seen in about 18% of patients, whereas actual kidney disease, associated with a higher level of virus, is seen in fewer than 10% of patients. Most people are exposed to BK virus, often in childhood, and it can remain indolent in the bladder and uroepithelium.
Patients can develop BK nephropathy after exposure to transplant immunosuppression.11 Posttransplant monitoring protocols typically include PCR testing for BK virus at 1, 3, 6, and 12 months. No agent has been identified to specifically treat BK virus. The general strategy is to minimize immunosuppressive therapy by reducing or eliminating mycophenolate mofetil. Fortunately, BK virus does not tend to recur, and patients can have a low-level viremia (< 10,000 copies/mL) persisting over months or even years but often without clinical consequences.
The appearance of BK virus on biopsy can mimic acute rejection. Before BK viral nephropathy was a recognized entity, patients would have been diagnosed with acute rejection and may have been put on high-dose steroids, which would have worsened the BK infection.
Posttransplant lymphoproliferative disorder
Posttransplant lymphoproliferative disorder is most often associated with Epstein-Barr virus and usually involves a diffuse large B-cell lymphoma. Burkitt lymphoma and plasma cell neoplasms also can occur, less commonly.
The condition is about 30 times more common in patients after transplant than in the general population, and it is the third most common malignancy in transplant patients after skin and cervical cancers. About 80% of the cases occur early after transplant, within the first year.
Patients typically have a marked elevation in viral load of Epstein-Barr virus, although a negative viral load does not rule it out. A patient who is serologically negative for Epstein-Barr virus receiving a donor-positive kidney is at highest risk; this situation is most often seen in the pediatric population. Potent induction therapies (eg, antilymphocyte antibody therapy) are also associated with posttransplant lymphoproliferative disorder.
Patients typically present with fever of unknown origin with no localizing signs or symptoms. Mass lesions can be challenging to find; positron emission tomography may be helpful. The culprit is usually a focal mass, ulcer (especially in the gastrointestinal tract), or infiltrate (commonly localized to the allograft). Multifocal or disseminated disease can also occur, with central nervous system, gastrointestinal, or pulmonary involvement.
Biopsy of the affected site is required for histopathology and Epstein-Barr virus markers. PCR blood testing is often positive for Epstein-Barr virus.
Typical antiviral therapy does not eliminate Epstein-Barr virus. In early polyclonal viral proliferation, the first goal is to reduce immunosuppressive therapy. Rituximab alone may also help in polymorphic cases. With disease that is clearly monomorphic and has transformed to a true malignancy, cytotoxic chemotherapy is also required. “R-CHOP,” a combination therapy consisting of rituximab with cyclophosphamide, doxorubicin, vincristine, and prednisone, is usually used. Radiation therapy may help in some cases.
Cryptococcal infection
Previously seen in patients with acquired immune deficiency syndrome, cryptococcal infection is now most commonly encountered in patients with solid-organ transplants. Vilchez et al12 found a 1% incidence in a series of more than 5,000 patients who had received an organ transplant.
Immunosuppression likely conveys risk, but because cryptococcal infection is acquired, environmental exposure also plays a role. It tends to appear more than 6 months after transplant, indicating that its cause is a primary infection by spore inhalation rather than by reactivation or transmission from the donor organ.13 Bird exposure is a risk factor for cryptococcal infection. One case identified the same strain of Cryptococcus in a kidney transplant recipient and the family’s pet cockatoo.14
Cryptococcal infection typically starts as pneumonia, which may be subclinical. The infection can then disseminate, with meningitis presenting with headache and mental status changes being the most concerning complication. The death rate is about 50% in most series of patients with meningitis. Skin and soft-tissue manifestations may also occur in 10% to 15% of cases and can be nodular, ulcerative, or cellulitic.
More than 75% of fungal infections requiring hospitalization in US patients who have undergone transplant are attributed to Candida, Aspergillus, or Cryptococcus species.15 Risk of fungal infection is increased with diabetes, longer duration of pretransplant dialysis, tacrolimus therapy, and rejection treatment.
REFERENCES
1. Wolfe RA, Ashby VB, Milford EL, et al. Comparison of mortality in all patients on dialysis, patients on dialysis awaiting transplantation, and recipients of a first cadaveric transplant. N Engl J Med 1999; 341:1725–1730.
2. Kasiske BL, Klinger D. Cigarette smoking in renal transplant recipients. J Am Soc Nephrol 2000; 11:753–759.
3. United Network for Organ Sharing. Transplant trends. https://transplantpro.org/technology/transplant-trends/#waitlists_by_organ. Accessed December 13, 2017.
4. Meier-Kriesche HU, Kaplan B. Waiting time on dialysis as the strongest modifiable risk factor for renal transplant outcomes: a paired donor kidney analysis. Transplantation 2002; 74:1377–1381.
5. Ojo AO, Hanson JA, Meier-Kriesche H, et al. Survival in recipients of marginal cadaveric donor kidneys compared with other recipients and wait-listed transplant candidates. J Am Soc Nephrol 2001; 12:589–597.
6. Alonso P, Sanchez-Lazaro I, Almenar L, et al. Use of a “CNI holidays” strategy in acute renal dysfunction late after heart transplant. Report of two cases. Heart Int 2014; 9:74–77.
7. Cantarovich M, Metrakos P, Giannetti N, Cecere R, Barkun J, Tchervenkov J. Anti-CD25 monoclonal antibody coverage allows for calcineurin inhibitor “holiday” in solid organ transplant patients with acute renal dysfunction. Transplantation 2002; 73:1169–1172.
8. Brennan DC, Daller JA, Lake KD, Cibrik D, Del Castillo D; Thymoglobulin Induction Study Group. Rabbit antithymocyte globulin versus basiliximab in renal transplantation. N Engl J Med 2006; 355:1967–1977.
9. McCullough PA. Evaluation and treatment of coronary artery disease in patients with end-stage renal disease. Kidney Int 2005; 67:S51–S58.
10. Kasiske BL, Maclean JR, Snyder JJ. Acute myocardial infarction and kidney transplantation. J Am Soc Nephrol 2006; 17:900–907.
11. Bohl DL, Storch GA, Ryschkewitsch C, et al. Donor origin of BK virus in renal transplantation and role of HLA C7 in susceptibility to sustained BK viremia. Am J Transplant 2005; 5:2213–2221.
12. Vilchez RA, Fung J, Kusne S. Cryptococcosis in organ transplant recipients: an overview. Am J Transplant 2002; 2:575–580.
13. Vilchez R, Shapiro R, McCurry K, et al. Longitudinal study of cryptococcosis in adult solid-organ transplant recipients. Transpl Int 2003; 16:336–340.
14. Nosanchuk JD, Shoham S, Fries BC, Shapiro DS, Levitz SM, Casadevall A. Evidence of zoonotic transmission of Cryptococcus neoformans from a pet cockatoo to an immunocompromised patient. Ann Intern Med 2000; 132:205–208.
15. Abbott KC, Hypolite I, Poropatich RK, et al. Hospitalizations for fungal infections after renal transplantation in the United States. Transpl Infect Dis 2001; 3:203–211.
Much has improved in renal transplantation over the past 20 years. The focus has shifted to using stronger immunotherapy rather than trying to minimize it. There has been increasing recognition of infection and ways to prevent and treat it. Induction therapy now has greater emphasis so that maintenance therapy can be eased, with the aim of reducing long-term toxicity. Perhaps the biggest change is the practice of screening for donor-specific antibodies at the time of transplant so that predictable problems can be prevented or better handled if they occur. Such advances have helped patients directly and by extending the life of their transplanted organs.
LONGER SURVIVAL
As early as the 1990s, it was recognized that kidney transplant offers a survival advantage for patients with end-stage renal disease over maintenance on dialysis.1 Although the risk of death is higher immediately after transplant, within a few months it becomes much lower than for patients on dialysis. Survival varies according to the health of the patient and the quality of the transplanted organ.
In general, patients who obtain the greatest benefit from transplants in terms of years of life gained are those with diabetes, especially those who are younger. Those ages 20 to 39 live about 8 years on dialysis vs 25 years after transplant.
CONTRAINDICATIONS TO TRANSPLANT
There are multiple contraindications to a solitary kidney transplant (Table 1), including smoking. Most transplant centers require that smokers quit before transplant. Long-standing smokers almost double their risk of a cardiac event after transplant and double their rate of malignancy. Active smoking at the time of transplant is associated with twice the risk of death by 10 years after transplant compared with that of nonsmokers.2 Cotinine testing can detect whether a patient is an active smoker.
WAITING-LIST CONSIDERATIONS
Organs are scarce
The number of patients on the kidney waiting list has increased rapidly in the last few decades, while the number of transplants performed each year has remained about the same. In 2016, about 100,000 patients were on the list, but only about 19,000 transplants were performed.3 Wait times, especially for deceased-donor organs, have increased to about 6 years, varying by blood type and geographic region.
Waiting-list placement
Placement on the waiting list for a deceased-donor kidney transplant occurs when a patient has an estimated glomerular filtration rate (GFR) of 20 mL/min/1.73 m2 or less, although referral to the list can be made earlier. Early listing remains advantageous, as total time on the list will be counted before starting dialysis. “Preemptive transplant” means the patient had no dialysis before transplant; this applies to about 10% of transplant recipients. These patients tend to fare the best and are usually recipients of a living-donor organ.
Most patients do not receive a transplant until the GFR is less than 15 mL/min/1.73 m2.
Since 2014, wait time has been measured from the beginning of dialysis rather than the date of waiting-list placement in patients who are listed after starting dialysis therapy. This approach is more fair but sometimes introduces problems. A patient who did not previously know about the list may suddenly jump to the head of the line after 10 years of dialysis, by which time comorbidities associated with long-term dialysis make the patient less likely to gain as much benefit from a transplant as people lower on the list. Time on dialysis, or “dialysis vintage,” predicts patient and kidney survival after transplant, with reduced survival associated with increasing time on dialysis.4
Shorter wait for a suboptimal kidney
The aging population has increased the number of older patients being listed for transplant, presenting multiple challenges. Patients age 65 or older have a 50% chance of dying before they receive a transplant during a 5-year wait.
A patient may shorten the wait by joining the list for a suboptimal organ. All deceased-donor organs are given a Kidney Donor Profile Index score, which predicts the longevity of an organ after transplant. The score is determined by donor age, kidney function based on the serum creatinine at the time of death, and other donor factors.
A kidney with a score higher than 85% is likely to function longer than only 15% of available kidneys. Patients who receive a kidney with that score have a longer period of risk of death soon after transplant and a slightly higher risk of death in the long term than patients who receive a healthier kidney, although on average they still do better than patients on dialysis.5
Older patients should be encouraged to sign up for both the regular waiting list and the suboptimal kidney waiting list to reduce the risk of dying before they get a kidney.
LIVING-DONOR ORGAN TRANSPLANT
Many advantages
Living-donor organ transplant is associated with a better survival rate than deceased-donor organ transplant, and the advantage becomes greater over time. At 1 year, patient survival is more than 90% in both groups, but by 5 years about 80% of patients with a living-donor organ are still alive vs only about 65% of patients with a deceased-donor organ.
The waiting time for a living-donor transplant may be only weeks to months, rather than years. Because increasing time on dialysis predicts worse patient and graft survival after transplant, the shorter wait time is a big advantage. In addition, because the donor and recipient are typically in adjacent operating rooms, the organ sustains less ischemic damage. In general, the kidney quality is better from healthy donors, resulting in superior function early on and longer graft survival by an average of 4 years. If the living donor is related to the recipient, human leukocyte antigen matching also tends to be better and predicts better outcomes.
Special challenges
Opting for a living-donor organ also entails special challenges. In addition to the ethical issues surrounding living-donor organ donation, an appropriate donor must be found. Donors must be highly motivated and pass physical, laboratory, and psychological evaluations.
For older patients, if the donor is a spouse or close friend, he or she is also likely to be older, making the organ less viable than one from a younger person. Even an adult child may not be an ideal donor if there is a family propensity to kidney disease, such as diabetic nephropathy. No test is available to determine the risk for future diabetes, but it is known to run in families.
POTENT IMMUNOSUPPRESSION
Induction therapy
Induction therapy with antithymocyte globulin or basiliximab provides intense immunosuppression to prevent acute rejection during the early posttransplant period.
Antithymocyte globulin is a potent agent that contains antibodies directed at T cells, B cells, neutrophils, platelets, adhesion molecules, and complement. It binds T cells and removes them from circulation by opsonization in splenic and lymphoid tissue. The immunosuppressive effect is sustained for at least 2 to 3 months after a series of injections (dosage 1.5 mg/kg/day, usually for 4 to 10 doses). Antithymocyte globulin is also used to treat acute rejection, especially high-grade rejection for which steroid therapy is likely to be insufficient.
Basiliximab consists of antibodies to the interleukin 2 (IL-2) receptor of T cells. Binding to T cells prevents their activation rather than removing them from circulation. The drug prevents rejection, with 30% relative reduction in early studies compared with placebo. However, it is ineffective in reversing established rejection. Dosage is 20 mg at day 0 and day 4, which provides receptor saturation for 30 to 45 days.
Basiliximab is also sometimes used off-label for patients who need to discontinue a calcineurin inhibitor (ie, tacrolimus or cyclosporine). In such cases, the usual maintenance therapy is put on hold while basiliximab is given for 1 or 2 doses. Case series have been reported for this use, particularly in heart or liver transplant recipients who develop acute kidney injury while hospitalized.6,7
Antithymocyte globulin is more effective but also riskier. Brennan et al8 randomized 278 transplant recipients to either antithymocyte globulin or basiliximab. Patients in the antithymocyte globulin group had a 16% rejection rate vs 26% in the basiliximab group.
Antithymocyte globulin therapy is associated with multiple adverse effects, including fever and chills, pulmonary edema, and long-standing immunosuppressive effects such as increased risk of lymphoma and cytomegalovirus (CMV) infection. Basiliximab side-effect profiles are similar to those of placebo.
Maintenance therapy
The calcineurin inhibitors cyclosporine and tacrolimus remain the standard of care in kidney transplant despite multiple drug interactions and side effects that include renal toxicity and fibrosis. Cyclosporine and tacrolimus both bind intracellular immunophilins, thereby preventing transcription of IL-2 and blocking T-cell activation and proliferation. The drugs work similarly but have different binding sites. Cyclosporine has largely been replaced by tacrolimus, whose more reliable dosing and greater potency are associated with lower rejection rates.
Tacrolimus is typically given twice daily (1–6 mg/dose). Twelve-hour trough levels are followed (target: 8–12 ng/mL early on, then 5–8 ng/mL after 3 months posttransplant). Side effects include hypertension and hypercholesterolemia, but less so than with cyclosporine. On the other hand, hyperglycemia tends to be worse with tacrolimus than with cyclosporine, and combining tacrolimus with steroids frequently leads to diabetes. Tacrolimus can also cause acute and chronic renal failure, especially at high drug levels, as well as neurotoxicity, tremors, and hair loss.
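The trough targets above amount to a simple time-dependent rule. A hedged sketch (the function names and the hard 3-month cutoff are ours; centers individualize targets):

```python
def tacrolimus_trough_target(months_post_transplant: float) -> tuple:
    """Return the 12-hour trough target range (ng/mL) cited in the text:
    8-12 ng/mL early on, then 5-8 ng/mL after 3 months. Illustrative only."""
    return (8, 12) if months_post_transplant <= 3 else (5, 8)

def trough_in_range(level_ng_ml: float, months_post_transplant: float) -> bool:
    """Check a measured trough against the target for that time point."""
    lo, hi = tacrolimus_trough_target(months_post_transplant)
    return lo <= level_ng_ml <= hi

print(trough_in_range(9.5, 1))  # True: early target is 8-12 ng/mL
print(trough_in_range(9.5, 6))  # False: later target is 5-8 ng/mL
```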
Cyclosporine, tacrolimus, and sirolimus (not a calcineurin inhibitor) are metabolized through the same cytochrome P450 pathway (CYP3A4), so they have common drug interactions (Table 2).
Mycophenolate mofetil is typically used as an adjunct therapy (500–1,000 mg twice daily). It is also used for other kidney diseases before transplant, including lupus nephritis. Rejection rates with mycophenolate mofetil plus steroids alone are about 40%, so the drug is not potent enough to be used without a calcineurin inhibitor.
Side effects include gastrointestinal toxicity in up to 20% of patients, and leukopenia, which is associated with viral infections.
CORONARY ARTERY DISEASE IS COMMON WITH DIALYSIS
Coronary artery disease is highly associated with end-stage kidney disease and occurs in as many as 85% of older patients with diabetes on dialysis. Although patients with end-stage kidney disease tend to have more numerous and severe atherosclerotic lesions compared with the general population, justifying aggressive management, cardiac care tends to be conservative in patients on dialysis.9
Death from acute myocardial infarction occurs in about 20% to 30% of patients on dialysis vs about 2% of patients with normal renal function. Five years after myocardial infarction, survival is only about 30% in patients on dialysis.9
There are many explanations for excess coronary artery disease in patients on dialysis. In addition to the traditional cardiovascular risk factors of diabetes, hypertension, and preexisting coronary artery disease, patients are in a proinflammatory uremic state and have high levels of phosphorus and fibroblast growth factor 23 that contribute to vascular calcification. Almost all patients have high homocysteine levels and hemodynamic instability, particularly if they are on hemodialysis.
Pretransplant evaluation for heart disease
Patients on the kidney transplant waiting list are screened aggressively for heart disease. A history of myocardial infarction usually results in removal from the list. All patients have an initial electrocardiogram and echocardiogram. Thallium or echocardiographic stress testing is used for patients who are age 50 and older, have diabetes, or have had dialysis for many years. Patients with evidence of ischemia undergo catheterization.
Patients are also screened with computed tomography before transplant. Because the kidney is typically anastomosed to the iliac artery and vein, heavy calcification of the iliac artery can make the procedure too difficult to perform.
Reduced long-term risk of myocardial infarction after transplant
Kasiske et al10 analyzed data from more than 50,000 patients from the US Renal Data System and found that, for about the first year after transplant, patients who underwent kidney transplant were more likely to have a myocardial infarction than those on dialysis. After that, they fared better than patients who remained on dialysis. Those with a living-donor transplant were less likely at all times to have a myocardial infarction than those with a deceased-donor transplant. By 3 years after transplant, the relative risk of having a myocardial infarction was 0.89 for deceased-donor organ recipients and 0.69 for living-donor recipients compared with patients on the waiting list.10
INFECTIOUS COMPLICATIONS IN KIDNEY RECIPIENTS
Kidney recipients are prone to many common and uncommon infections (Table 3). All potential recipients are tested pretransplant for hepatitis B, hepatitis C, human immunodeficiency virus, syphilis, and tuberculosis. A positive result does not necessarily rule out transplant.
The following viral serology tests are also done before transplant:
Epstein-Barr virus (antibodies are positive in about 90% of adults)
CMV (about 70% of adults are seropositive)
Varicella zoster (seronegative patients should be given live-attenuated varicella vaccine).
Risk of transmission of these viruses relates to the serostatus of the donor and recipient before transplant. If a donor is positive for viral antibodies but the recipient is not (a so-called “mismatch”), risk is higher after transplant.
Hepatitis C
Patients with hepatitis C fare better if they get a transplant than if they remain on dialysis, although their posttransplant course is worse compared with transplant patients who do not have hepatitis. Some patients develop accelerated liver disease after kidney transplant. Hepatitis C-related kidney disease—membranoproliferative glomerulonephritis—also occurs, as do comorbidities such as diabetes.
Careful evaluation is warranted before transplant, including liver imaging, alpha-fetoprotein testing, and liver biopsy to evaluate for hepatocellular carcinoma. A patient with advanced fibrosis or cirrhosis may not be a candidate for kidney transplant alone but could possibly receive a combined kidney and liver transplant.
The best time to treat hepatitis C infection remains to be determined. Patients with advanced liver disease or hepatitis C-related kidney disease would likely benefit from early treatment. However, delaying treatment could shorten the wait time for a deceased-donor organ positive for hepatitis C. Transplant candidates with active hepatitis C are uniquely positioned to accept hepatitis C-positive kidneys, which are otherwise often discarded, and may wait only weeks for such a transplant. The shorter graft survival historically associated with hepatitis C-positive kidneys may no longer apply with the new antiviral hepatitis C therapies, which have been shown to be effective posttransplant.
Hepatitis B
No cure is available for hepatitis B infection, but it can be well controlled with antiviral therapy. Patients with hepatitis B infection may be candidates for transplant, but they should be stable on antiviral therapy (lamivudine, entecavir, or tenofovir) with the viral load suppressed before transplant, and therapy should be continued afterward. Liver imaging, alpha-fetoprotein levels, and biopsy are recommended for evaluation. All hepatitis B-negative patients should be vaccinated before transplant.
Organs from living or deceased donors that test positive for hepatitis B core antibody, indicating prior exposure, can be considered for transplant in a patient who tests positive for hepatitis B surface antibody, indicating successful vaccination or prior exposure in the recipient. But donors must have negative surface antigen and polymerase chain reaction (PCR) tests that indicate no active hepatitis B infection.
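The acceptance rule in this paragraph reduces to a boolean check. The function below is our illustrative rendering of it, not an allocation policy:

```python
def can_accept_hbcab_positive_kidney(recipient_hbsab_positive: bool,
                                     donor_hbsag_positive: bool,
                                     donor_pcr_positive: bool) -> bool:
    """Decision rule from the text for hepatitis B core antibody-positive
    donor organs: the recipient must have surface antibody (vaccination or
    prior exposure), and the donor must have negative surface antigen and
    PCR tests, indicating no active infection."""
    return (recipient_hbsab_positive
            and not donor_hbsag_positive
            and not donor_pcr_positive)
```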
Cytomegalovirus
CMV typically does not appear until prophylactic therapy is stopped. Classic symptoms are fever, leukopenia, and diarrhea. Infection can involve any organ, and patients may present with hepatitis, pancreatitis or, less commonly, pneumonitis.
Patients who are negative for CMV before transplant and receive a donor-positive organ are at the highest risk. Patients who are CMV IgG-positive are considered to be at intermediate risk, regardless of the donor status. Patients who are negative for CMV and receive a donor-negative organ are at the lowest risk and do not need prophylaxis with valganciclovir.
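This donor/recipient stratification is essentially a three-row decision table; a minimal sketch (the tier labels are ours):

```python
def cmv_risk(donor_igg_positive: bool, recipient_igg_positive: bool) -> str:
    """CMV risk tier from pretransplant serostatus, per the text:
    D+/R- is highest risk, any seropositive recipient is intermediate
    risk regardless of donor status, and D-/R- is lowest risk."""
    if recipient_igg_positive:
        return "intermediate"
    return "high" if donor_igg_positive else "low"

print(cmv_risk(donor_igg_positive=True, recipient_igg_positive=False))  # high
```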
CMV infection is diagnosed by PCR testing of the blood or immunostaining in tissue biopsy. Occasionally, blood testing is negative in the face of tissue-based disease.
BK virus
BK virus is a polyomavirus commonly associated with kidney transplant. Viremia is seen in about 18% of patients, whereas actual kidney disease, associated with higher viral levels, is seen in fewer than 10%. Most people are exposed to BK virus, often in childhood, and it can remain indolent in the bladder and uroepithelium.
Patients can develop BK nephropathy after exposure to transplant immunosuppression.11 Posttransplant monitoring protocols typically include PCR testing for BK virus at 1, 3, 6, and 12 months. No agent has been identified to specifically treat BK virus. The general strategy is to minimize immunosuppressive therapy by reducing or eliminating mycophenolate mofetil. Fortunately, BK virus does not tend to recur, and patients can have a low-level viremia (< 10,000 copies/mL) persisting over months or even years but often without clinical consequences.
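The monitoring schedule and the 10,000 copies/mL figure above can be captured in a few lines (the classification labels are ours; real management rests on trends, biopsy, and overall immunosuppression):

```python
# Typical posttransplant BK PCR screening time points cited in the text.
BK_PCR_MONTHS = (1, 3, 6, 12)

def bk_viremia_flag(copies_per_ml: float) -> str:
    """Classify a BK PCR result using the 10,000 copies/mL figure the text
    gives for persistent low-level viremia. Purely illustrative."""
    if copies_per_ml == 0:
        return "undetectable"
    return "low-level" if copies_per_ml < 10_000 else "high-level"
```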
The appearance of BK virus on biopsy can mimic acute rejection. Before BK viral nephropathy was a recognized entity, patients would have been diagnosed with acute rejection and may have been put on high-dose steroids, which would have worsened the BK infection.
Posttransplant lymphoproliferative disorder
Posttransplant lymphoproliferative disorder is most often associated with Epstein-Barr virus and usually involves diffuse large B-cell lymphoma. Burkitt lymphoma and plasma cell neoplasms can also occur, though less commonly.
The condition is about 30 times more common in patients after transplant than in the general population, and it is the third most common malignancy in transplant patients after skin and cervical cancers. About 80% of the cases occur early after transplant, within the first year.
Patients typically have a marked elevation in viral load of Epstein-Barr virus, although a negative viral load does not rule it out. A patient who is serologically negative for Epstein-Barr virus receiving a donor-positive kidney is at highest risk; this situation is most often seen in the pediatric population. Potent induction therapies (eg, antilymphocyte antibody therapy) are also associated with posttransplant lymphoproliferative disorder.
Patients typically present with fever of unknown origin with no localizing signs or symptoms. Mass lesions can be challenging to find; positron emission tomography may be helpful. The culprit is usually a focal mass, ulcer (especially in the gastrointestinal tract), or infiltrate (commonly localized to the allograft). Multifocal or disseminated disease can also occur, with central nervous system, gastrointestinal, or pulmonary involvement.
Biopsy of the affected site is required for histopathology and Epstein-Barr virus markers. PCR blood testing is often positive for Epstein-Barr virus.
Typical antiviral therapy does not eliminate Epstein-Barr virus. In early polyclonal viral proliferation, the first goal is to reduce immunosuppressive therapy. Rituximab alone may also help in polymorphic cases. With disease that is clearly monomorphic and has transformed to a true malignancy, cytotoxic chemotherapy is also required. “R-CHOP,” a combination therapy consisting of rituximab with cyclophosphamide, doxorubicin, vincristine, and prednisone, is usually used. Radiation therapy may help in some cases.
Cryptococcal infection
Previously seen mainly in patients with acquired immunodeficiency syndrome, cryptococcal infection is now most commonly encountered in patients with solid-organ transplants. Vilchez et al12 found a 1% incidence in a series of more than 5,000 organ transplant recipients.
Immunosuppression likely conveys risk, but because cryptococcal infection is acquired, environmental exposure also plays a role. It tends to appear more than 6 months after transplant, indicating that its cause is a primary infection by spore inhalation rather than by reactivation or transmission from the donor organ.13 Bird exposure is a risk factor for cryptococcal infection. One case identified the same strain of Cryptococcus in a kidney transplant recipient and the family’s pet cockatoo.14
Cryptococcal infection typically starts as pneumonia, which may be subclinical. The infection can then disseminate, with meningitis presenting with headache and mental status changes being the most concerning complication. The death rate is about 50% in most series of patients with meningitis. Skin and soft-tissue manifestations may also occur in 10% to 15% of cases and can be nodular, ulcerative, or cellulitic.
More than 75% of fungal infections requiring hospitalization in US transplant recipients are attributed to Candida, Aspergillus, or Cryptococcus species.15 Risk of fungal infection increases with diabetes, longer duration of pretransplant dialysis, tacrolimus therapy, and treatment for rejection.
Waiting-list placement
Placement on the waiting list for a deceased-donor kidney transplant occurs when a patient has an estimated glomerular filtration rate (GFR) of 20 mL/min/1.73 m2 or less, although referral can be made earlier. Early listing remains advantageous, as time accrued on the list before starting dialysis still counts toward the wait. “Preemptive transplant” means the patient had no dialysis before transplant; this applies to about 10% of transplant recipients. These patients tend to fare the best and are usually recipients of a living-donor organ.
Most patients do not receive a transplant until the GFR is less than 15 mL/min/1.73 m2.
Since 2014, wait time has been measured from the start of dialysis rather than from the date of waiting-list placement in patients who are listed after starting dialysis therapy. This approach is fairer but sometimes introduces problems. A patient who did not previously know about the list may suddenly jump to the head of the line after 10 years of dialysis, by which time comorbidities associated with long-term dialysis make the patient less likely to gain as much benefit from a transplant as people lower on the list. Time on dialysis, or “dialysis vintage,” predicts patient and kidney survival after transplant, with survival decreasing as time on dialysis increases.4
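Under the post-2014 rule, wait-time credit effectively runs from the earlier of the dialysis start date and the listing date (the latter covering preemptive listing). A sketch, assuming both dates are known (the function name is ours):

```python
from datetime import date

def wait_time_years(dialysis_start: date, listing_date: date, today: date) -> float:
    """Allocation wait time under the post-2014 crediting rule described in
    the text: credit runs from dialysis start, or from listing if the
    patient was listed preemptively, ie, the earlier of the two dates.
    Simplified illustration only."""
    start = min(dialysis_start, listing_date)
    return (today - start).days / 365.25
```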
Shorter wait for a suboptimal kidney
The aging population has increased the number of older patients being listed for transplant, presenting multiple challenges. Patients age 65 or older have a 50% chance of dying before they receive a transplant during a 5-year wait.
A patient may shorten the wait by joining the list for a suboptimal organ. All deceased-donor organs are given a Kidney Donor Profile Index score, which predicts the longevity of an organ after transplant. The score is determined by donor age, kidney function based on the serum creatinine at the time of death, and other donor factors.
A kidney with a score higher than 85% is likely to function longer than only 15% of available kidneys. Patients who receive such a kidney face a longer period of elevated risk of death soon after transplant and a slightly higher long-term risk of death than patients who receive a healthier kidney, although on average they still do better than patients who remain on dialysis.5
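The percentile logic of the Kidney Donor Profile Index can be sketched as follows (the 85% cutoff is from the text; the labels and wording are ours):

```python
def kdpi_interpretation(kdpi_percent: float) -> str:
    """Interpret a Kidney Donor Profile Index score as the text describes:
    a KDPI of X% means the kidney is expected to function longer than only
    (100 - X)% of available kidneys. Illustrative labels, not policy."""
    expected_to_outlast = 100 - kdpi_percent
    if kdpi_percent > 85:
        return f"suboptimal: expected to outlast only {expected_to_outlast:.0f}% of available kidneys"
    return f"standard: expected to outlast {expected_to_outlast:.0f}% of available kidneys"

print(kdpi_interpretation(90))  # suboptimal: expected to outlast only 10% of available kidneys
```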
Older patients should be encouraged to sign up for both the regular waiting list and the suboptimal kidney waiting list to reduce the risk of dying before they get a kidney.
LIVING-DONOR ORGAN TRANSPLANT
Many advantages
Living-donor organ transplant is associated with a better survival rate than deceased-donor organ transplant, and the advantage becomes greater over time. At 1 year, patient survival is more than 90% in both groups, but by 5 years about 80% of patients with a living-donor organ are still alive vs only about 65% of patients with a deceased-donor organ.
The waiting time for a living-donor transplant may be only weeks to months, rather than years. Because increasing time on dialysis predicts worse patient and graft survival after transplant, the shorter wait time is a big advantage. In addition, because the donor and recipient are typically in adjacent operating rooms, the organ sustains less ischemic damage. In general, the kidney quality is better from healthy donors, resulting in superior function early on and longer graft survival by an average of 4 years. If the living donor is related to the recipient, human leukocyte antigen matching also tends to be better and predicts better outcomes.
Special challenges
Opting for a living-donor organ also entails special challenges. In addition to the ethical issues surrounding living-donor organ donation, an appropriate donor must be found. Donors must be highly motivated and pass physical, laboratory, and psychological evaluations.
For older patients, if the donor is a spouse or close friend, he or she is also likely to be older, making the organ less viable than one from a younger person. Even an adult child may not be an ideal donor if there is a family propensity to kidney disease, such as diabetic nephropathy. No test is available to determine the risk for future diabetes, but it is known to run in families.
POTENT IMMUNOSUPPRESSION
Induction therapy
Induction therapy with antithymocyte globulin or basiliximab provides intense immunosuppression to prevent acute rejection during the early posttransplant period.
Antithymocyte globulin is a potent agent that contains antibodies directed at T cells, B cells, neutrophils, platelets, adhesion molecules, and complement. It binds T cells and removes them from circulation by opsonization in splenic and lymphoid tissue. The immunosuppressive effect is sustained for at least 2 to 3 months after a series of injections (dosage 1.5 mg/kg/day, usually for 4 to 10 doses). Antithymocyte globulin is also used to treat acute rejection, especially high-grade rejection for which steroid therapy is likely to be insufficient.
Basiliximab consists of antibodies to the interleukin 2 (IL-2) receptor of T cells. Binding to T cells prevents their activation rather than removing them from circulation. The drug prevents rejection, with 30% relative reduction in early studies compared with placebo. However, it is ineffective in reversing established rejection. Dosage is 20 mg at day 0 and day 4, which provides receptor saturation for 30 to 45 days.
Basiliximab is also sometimes used off-label for patients who need to discontinue a calcineurin inhibitor (ie, tacrolimus or cyclosporine). In such cases, normal therapy is put on hold while basiliximab is given for 1 or 2 doses. Case series have been reported for this use, particularly for patients with a heart and liver transplant who develop acute kidney injury while hospitalized.6,7
Antithymocyte globulin is more effective but also more risky. Brennan et al8 randomized 278 transplant recipients to either antithymocyte globulin or basiliximab. Patients in the antithymocyte globulin group had a 16% rejection rate vs 26% in the basiliximab group.
Antithymocyte globulin therapy is associated with multiple adverse effects, including fever and chills, pulmonary edema, and long-standing immunosuppressive effects such as increased risk of lymphoma and cytomegalovirus (CMV) infection. Basiliximab side-effect profiles are similar to those of placebo.
Maintenance therapy
The calcineurin inhibitors cyclosporine and tacrolimus remain the standard of care in kidney transplant despite multiple drug interactions and side effects that include renal toxicity and fibrosis. Cyclosporine and tacrolimus both bind intracellular immunophilins and thereby prevent transcription of IL-2 and production of T cells. The drugs work similarly but have different binding sites. Cyclosporine has largely been replaced by tacrolimus because its reliability of dosing and higher potency are associated with lower rejection rates.
Tacrolimus is typically given twice daily (1–6 mg/dose). Twelve-hour trough levels are followed (target: 8–12 ng/mL early on, then 5–8 ng/mL after 3 months posttransplant). Side effects include hypertension and hypercholesterolemia, but less so than with cyclosporine. On the other hand, hyperglycemia tends to be worse with tacrolimus than with cyclosporine, and combining tacrolimus with steroids frequently leads to diabetes. Tacrolimus can also cause acute and chronic renal failure, especially at high drug levels, as well as neurotoxicity, tremors, and hair loss.
Cyclosporine, tacrolimus, and sirolimus (not a calcineurin inhibitor) are metabolized through the same cytochrome P450 pathway (CYP3A4), so they have common drug interactions (Table 2).
Mycophenolate mofetil is typically used as an adjunct therapy (500–1,000 mg twice daily). It is also used for other kidney diseases before transplant, including lupus nephritis. Transplanted kidney rejection rates with mycophenolate mofetil with steroids are about 40%, so the drug is not potent enough to be used without a calcineurin inhibitor.
Side effects include gastrointestinal toxicity in up to 20% of patients, and leukopenia, which is associated with viral infections.
CORONARY ARTERY DISEASE IS COMMON WITH DIALYSIS
Coronary artery disease is highly associated with end-stage kidney disease and occurs in as many as 85% of older patients with diabetes on dialysis. Although patients with end-stage kidney disease tend to have more numerous and severe atherosclerotic lesions compared with the general population, justifying aggressive management, cardiac care tends to be conservative in patients on dialysis.9
Death from acute myocardial infarction occurs in about 20% to 30% of patients on dialysis vs about 2% of patients with normal renal function. Five years after myocardial infarction, survival is only about 30% in patients on dialysis.9
There are many explanations for excess coronary artery disease in patients on dialysis. In addition to the traditional cardiovascular risk factors of diabetes, hypertension, and preexisting coronary artery disease, patients are in a proinflammatory uremic state and have high levels of phosphorus and fibroblast growth factor 23 that contribute to vascular calcification. Almost all patients have high homocysteine levels and hemodynamic instability, particularly if they are on hemodialysis.
Pretransplant evaluation for heart disease
Patients on the kidney transplant waiting list are screened aggressively for heart disease. A history of myocardial infarction usually results in removal from the list. All patients have an initial electrocardiogram and echocardiogram. Thallium or echocardiographic stress testing is used for patients who are age 50 and older, have diabetes, or have had dialysis for many years. Patients with evidence of ischemia undergo catheterization.
Patients are also screened with computed tomography before transplant. Because the kidney is typically anastomosed to the iliac artery and vein, heavy calcification of the iliac artery can make the procedure too difficult to perform.
Reduced long-term risk of myocardial infarction after transplant
Kasiske et al10 analyzed data from more than 50,000 patients from the US Renal Data System and found that, for about the first year after transplant, patients who underwent kidney transplant were more likely to have a myocardial infarction than those on dialysis. After that, they fared better than patients who remained on dialysis. Those with a living-donor transplant were less likely at all times to have a myocardial infarction than those with a deceased-donor transplant. By 3 years after transplant, the relative risk of having a myocardial infarction was 0.89 for deceased-donor organ recipients and 0.69 for living-donor recipients compared with patients on the waiting list.10
INFECTIOUS COMPLICATIONS IN KIDNEY RECIPIENTS
Kidney recipients are prone to many common and uncommon infections (Table 3). All potential recipients are tested pretransplant for hepatitis B, hepatitis C, human immunodeficiency virus, syphilis, and tuberculosis. A positive result does not necessarily rule out transplant.
The following viral serology tests are also done before transplant:
Epstein-Barr virus (antibodies are positive in about 90% of adults)
CMV (about 70% of adults are seropositive)
Varicella zoster (seronegative patients should be given live-attenuated varicella vaccine).
Risk of transmission of these viruses relates to the serostatus of the donor and recipient before transplant. If a donor is positive for viral antibodies but the recipient is not (a so-called “mismatch”), risk is higher after transplant.
Hepatitis C
Patients with hepatitis C fare better if they get a transplant than if they remain on dialysis, although their posttransplant course is worse compared with transplant patients who do not have hepatitis. Some patients develop accelerated liver disease after kidney transplant. Hepatitis C-related kidney disease—membranous proliferative glomerulonephritis—also occurs, as do comorbidities such as diabetes.
Careful evaluation is warranted before transplant, including liver imaging, alpha-fetoprotein testing, and liver biopsy to evaluate for hepatocellular carcinoma. A patient with advanced fibrosis or cirrhosis may not be a candidate for kidney transplant alone but could possibly receive a combined kidney and liver transplant.
There is a need to determine the best time to treat hepatitis C infection. Patients with advanced liver disease or hepatitis C-related kidney disease would likely benefit from early treatment. However, delaying treatment could shorten the wait time for a deceased-donor organ positive for hepatitis C. Transplant candidates with active hepatitis C are uniquely considered to accept hepatitis C-positive kidneys, which are often discarded, and may only wait weeks for such a transplant. The shortened kidney survival associated with a hepatitis C-positive kidney may no longer be true with the new antiviral hepatitis C therapy, which has been shown to be effective post-transplant.
Hepatitis B
No cure is available for hepatitis B infection, but it can be well controlled with antiviral therapy. Patients with hepatitis B infection may be candidates for transplant, but they should be stable on antiviral therapy (lamivudine, entecavir, or tenofovir) to eliminate the viral load before transplant, and therapy should be continued afterward. Liver imaging, alpha-fetoprotein levels, and biopsy are recommended for evaluation. All hepatitis B- negative patients should be vaccinated before transplant.
Organs from living or deceased donors that test positive for hepatitis B core antibody, indicating prior exposure, can be considered for transplant in a patient who tests positive for hepatitis B surface antibody, indicating successful vaccination or prior exposure in the recipient. But donors must have negative surface antigen and polymerase chain reaction (PCR) tests that indicate no active hepatitis B infection.
Cytomegalovirus
CMV typically does not appear until prophylactic therapy is stopped. Classic symptoms are fever, leukopenia, and diarrhea. Infection can involve any organ, and patients may present with hepatitis, pancreatitis or, less commonly, pneumonitis.
Patients who are negative for CMV before transplant and receive a donor-positive organ are at the highest risk. Patients who are CMV IgG-positive are considered to be at intermediate risk, regardless of the donor status. Patients who are negative for CMV and receive a donor-negative organ are at the lowest risk and do not need prophylaxis with valganciclovir.
CMV infection is diagnosed by PCR testing of the blood or immunostaining in tissue biopsy. Occasionally, blood testing is negative in the face of tissue-based disease.
BK virus
BK is a polyoma virus and a common virus associated with kidney transplant. Viremia is seen in about 18% of patients, whereas actual kidney disease associated with a higher level of virus is seen in fewer than 10% of patients. Most people are exposed to BK virus, often in childhood, and it can remain indolent in the bladder and uroepithelium.
Patients can develop BK nephropathy after exposure to transplant immunosuppression.11 Posttransplant monitoring protocols typically include PCR testing for BK virus at 1, 3, 6, and 12 months. No agent has been identified to specifically treat BK virus. The general strategy is to minimize immunosuppressive therapy by reducing or eliminating mycophenolate mofetil. Fortunately, BK virus does not tend to recur, and patients can have a low-level viremia (< 10,000 copies/mL) persisting over months or even years but often without clinical consequences.
The appearance of BK virus on biopsy can mimic acute rejection. Before BK viral nephropathy was a recognized entity, patients were often diagnosed with acute rejection and treated with high-dose steroids, which worsened the BK infection.
Posttransplant lymphoproliferative disorder
Posttransplant lymphoproliferative disorder is most often associated with Epstein-Barr virus and usually involves diffuse large B-cell lymphoma. Burkitt lymphoma and plasma cell neoplasms occur less commonly.
The condition is about 30 times more common in patients after transplant than in the general population, and it is the third most common malignancy in transplant patients after skin and cervical cancers. About 80% of the cases occur early after transplant, within the first year.
Patients typically have a marked elevation in viral load of Epstein-Barr virus, although a negative viral load does not rule it out. A patient who is serologically negative for Epstein-Barr virus receiving a donor-positive kidney is at highest risk; this situation is most often seen in the pediatric population. Potent induction therapies (eg, antilymphocyte antibody therapy) are also associated with posttransplant lymphoproliferative disorder.
Patients typically present with fever of unknown origin and no localizing signs or symptoms. Mass lesions can be challenging to find; positron emission tomography may be helpful. The culprit is usually a focal mass, an ulcer (especially in the gastrointestinal tract), or an infiltrate (commonly localized to the allograft). Multifocal or disseminated disease can also occur, including central nervous system, gastrointestinal, or pulmonary involvement.
Biopsy of the affected site is required for histopathology and Epstein-Barr virus markers. PCR blood testing is often positive for Epstein-Barr virus.
Typical antiviral therapy does not eliminate Epstein-Barr virus. In early polyclonal viral proliferation, the first goal is to reduce immunosuppressive therapy. Rituximab alone may also help in polymorphic cases. With disease that is clearly monomorphic and has transformed to a true malignancy, cytotoxic chemotherapy is also required. “R-CHOP,” a combination therapy consisting of rituximab with cyclophosphamide, doxorubicin, vincristine, and prednisone, is usually used. Radiation therapy may help in some cases.
Cryptococcal infection
Once seen mainly in patients with acquired immunodeficiency syndrome, cryptococcal infection is now most commonly encountered in patients with solid-organ transplants. Vilchez et al12 found a 1% incidence in a series of more than 5,000 organ transplant recipients.
Immunosuppression likely conveys risk, but because cryptococcal infection is acquired, environmental exposure also plays a role. It tends to appear more than 6 months after transplant, suggesting that its cause is primary infection by spore inhalation rather than reactivation or transmission from the donor organ.13 Bird exposure is a risk factor; one case report identified the same strain of Cryptococcus in a kidney transplant recipient and the family’s pet cockatoo.14
Cryptococcal infection typically starts as pneumonia, which may be subclinical. The infection can then disseminate; the most concerning complication is meningitis, which presents with headache and mental status changes. In most series, the death rate from cryptococcal meningitis is about 50%. Skin and soft-tissue manifestations occur in 10% to 15% of cases and can be nodular, ulcerative, or cellulitic.
More than 75% of fungal infections requiring hospitalization in US transplant recipients are attributed to Candida, Aspergillus, or Cryptococcus species.15 The risk of fungal infection is increased with diabetes, longer duration of pretransplant dialysis, tacrolimus therapy, and treatment for rejection.
- Wolfe RA, Ashby VB, Milford EL, et al. Comparison of mortality in all patients on dialysis, patients on dialysis awaiting transplantation, and recipients of a first cadaveric transplant. N Engl J Med 1999; 341:1725–1730.
- Kasiske BL, Klinger D. Cigarette smoking in renal transplant recipients. J Am Soc Nephrol 2000; 11:753–759.
- United Network for Organ Sharing. Transplant trends. https://transplantpro.org/technology/transplant-trends/#waitlists_by_organ. Accessed December 13, 2017.
- Meier-Kriesche HU, Kaplan B. Waiting time on dialysis as the strongest modifiable risk factor for renal transplant outcomes: a paired donor kidney analysis. Transplantation 2002; 74:1377–1381.
- Ojo AO, Hanson JA, Meier-Kriesche H, et al. Survival in recipients of marginal cadaveric donor kidneys compared with other recipients and wait-listed transplant candidates. J Am Soc Nephrol 2001; 12:589–597.
- Alonso P, Sanchez-Lazaro I, Almenar L, et al. Use of a “CNI holidays” strategy in acute renal dysfunction late after heart transplant. Report of two cases. Heart Int 2014; 9:74–77.
- Cantarovich M, Metrakos P, Giannetti N, Cecere R, Barkun J, Tchervenkov J. Anti-CD25 monoclonal antibody coverage allows for calcineurin inhibitor “holiday” in solid organ transplant patients with acute renal dysfunction. Transplantation 2002; 73:1169–1172.
- Brennan DC, Daller JA, Lake KD, Cibrik D, Del Castillo D; Thymoglobulin Induction Study Group. Rabbit antithymocyte globulin versus basiliximab in renal transplantation. N Engl J Med 2006; 355:1967–1977.
- McCullough PA. Evaluation and treatment of coronary artery disease in patients with end-stage renal disease. Kidney Int 2005; 67:S51–S58.
- Kasiske BL, Maclean JR, Snyder JJ. Acute myocardial infarction and kidney transplantation. J Am Soc Nephrol 2006; 17:900–907.
- Bohl DL, Storch GA, Ryschkewitsch C, et al. Donor origin of BK virus in renal transplantation and role of HLA C7 in susceptibility to sustained BK viremia. Am J Transplant 2005; 5:2213–2221.
- Vilchez RA, Fung J, Kusne S. Cryptococcosis in organ transplant recipients: an overview. Am J Transplant 2002; 2:575–580.
- Vilchez R, Shapiro R, McCurry K, et al. Longitudinal study of cryptococcosis in adult solid-organ transplant recipients. Transpl Int 2003; 16:336–340.
- Nosanchuk JD, Shoham S, Fries BC, Shapiro DS, Levitz SM, Casadevall A. Evidence of zoonotic transmission of Cryptococcus neoformans from a pet cockatoo to an immunocompromised patient. Ann Intern Med 2000; 132:205–208.
- Abbott KC, Hypolite I, Poropatich RK, et al. Hospitalizations for fungal infections after renal transplantation in the United States. Transpl Infect Dis 2001; 3:203–211.
KEY POINTS
- Kidney transplant improves survival and long-term outcomes in patients with renal failure.
- Before transplant, patients should be carefully evaluated for cardiovascular and infectious disease risk.
- Potent immunosuppression is required to maintain a successful kidney transplant.
- After transplant, patients must be monitored for recurrent disease, side effects of immunosuppression, and opportunistic infections.
PCI for stable angina: A missed opportunity for shared decision-making
Multiple randomized controlled trials have compared percutaneous coronary intervention (PCI) vs optimal medical therapy for patients with chronic stable angina. All have consistently shown that PCI does not reduce the risk of death or even myocardial infarction (MI) but that it may relieve angina temporarily. Nevertheless, PCI is still commonly performed for patients with stable coronary disease, often in the absence of angina, and patients mistakenly believe the procedure is life-saving. Cardiologists may not be aware of patients’ misperceptions, or worse, may encourage them. In either case, if patients do not understand the benefits of the procedure, they cannot give informed consent.
This article reviews the pathophysiology of coronary artery disease, evidence from clinical trials of the value of PCI for chronic stable angina, patient and physician perceptions of PCI, and ways to promote patient-centered, shared decision-making.
CLINICAL CASE: EXERTIONAL ANGINA
While climbing 4 flights of stairs, a 55-year-old man noticed tightness in his chest, which lasted for 5 minutes and resolved spontaneously. Several weeks later, when visiting his primary care physician, he mentioned the episode. He had had no symptoms in the interim, but the physician ordered an exercise stress test.
Six minutes into a standard Bruce protocol, the patient experienced the same chest tightness, accompanied by 1-mm ST-segment depressions in leads II, III, and aVF. He was then referred to a cardiologist, who recommended catheterization.
Catheterization demonstrated a 95% stenosis of the right coronary artery with nonsignificant stenoses of the left anterior descending and circumflex arteries. A drug-eluting stent was placed in the right coronary artery, with no residual stenosis.
Did this intervention likely prevent an MI and perhaps save the man’s life?
HOW MYOCARDIAL INFARCTION HAPPENS
Understanding the pathogenesis of MI is critical to having realistic expectations of the benefits of stent placement.
Doctors often describe coronary atherosclerosis as a plumbing problem, in which deposits of cholesterol and fat build up in arterial walls, clogging the pipes and eventually causing a heart attack. This analogy, which has been around since the 1950s, is easy for patients to grasp and has been popularized in the press and internalized by the public; as one patient with a 95% stenosis put it, “I was 95% dead.” In that model, angioplasty and stenting resolve the blockage and “fix” the problem, much as a plumber clears your pipes with a Roto-Rooter.
Despite the visual appeal of this model,1 it doesn’t accurately convey what we know about the pathophysiology of coronary artery disease. Instead of a gradual buildup of fatty deposits, low-density lipoprotein cholesterol particles infiltrate arterial walls and trigger an inflammatory reaction as they are engulfed by macrophages, leading to a cascade of cytokines and recruitment of more inflammatory cells.2 This immune response can eventually cause the rupture of the plaque’s fibrous cap, triggering thrombosis and infarction, often at a site of insignificant stenosis.
In this new model, coronary artery disease is primarily a problem of inflammation distributed throughout the vasculature, rather than a mechanical problem localized to the site of a significant stenosis.
Significant stenosis does not equal unstable plaque
Not all plaques are equally likely to rupture. Stable plaques tend to be long-standing and calcified, with a thick fibrous cap. A stable plaque causing a 95% stenosis may cause symptoms with exertion, but it is unlikely to cause infarction.3 Conversely, a rupture-prone plaque may cause little stenosis yet harbor a large and dangerous core beneath its thin fibrous cap.
Relying on angiography can be misleading. Treating all significant stenoses improves blood flow, but does not reduce the risk of infarction, because infarction most often occurs in areas where the lumen is not obstructed. A plaque causing only 30% stenosis can suddenly rupture, causing thrombosis and complete occlusion.
The current model explains why PCI is no better than optimal medical therapy (ie, risk factor modification, antiplatelet therapy with aspirin, and a statin). Diet, exercise, smoking cessation, and statins target inflammatory processes and lower low-density lipoprotein cholesterol levels, while aspirin prevents platelet aggregation, among other likely actions.
The model also explains why coronary artery bypass grafting reduces the risk of MI and death in patients with left main or 3-vessel disease. A patient with generalized coronary artery disease has multiple lesions, many of which do not cause significant stenoses. PCI corrects only a single stenosis, whereas coronary artery bypass grafting circumvents all the vulnerable plaques in a vessel.
THE LANDMARK COURAGE TRIAL
Published in 2007, the Clinical Outcomes Utilizing Revascularization and Aggressive Drug Evaluation (COURAGE) trial4 randomized more than 2,000 patients to receive either optimal medical therapy plus PCI or optimal medical therapy alone. The primary outcome was a composite of death from any cause and nonfatal MI. Patients were followed for at least 3 years, and some for as long as 7 years.
There was an initial small upward spike in the primary outcome in the PCI arm due to periprocedural events. By 5 years, the outcomes of the 2 arms converged and then stayed the same for up to 15 years.5 The authors concluded that PCI conferred no benefit over optimal medical therapy in the risk of death or MI.
Some doctors dismiss the study because of its stringent entry criteria—of 35,539 patients assessed, only 3,071 met the eligibility criteria. However, the entry criteria were meant to identify patients most likely to benefit from PCI. Many patients who undergo PCI today would not have qualified for the study because they lack objective evidence of ischemia.6 To enroll, patients needed a proximal stenosis of at least 70% and objective evidence of ischemia or a coronary stenosis of more than 80% and classic angina. Exclusion criteria disqualified few patients: Canadian Cardiovascular Society class IV angina (ie, angina evoked from minimal activity or at rest); a markedly positive stress test (substantial ST-segment depression or hypotension during stage I of the Bruce protocol); refractory heart failure or cardiogenic shock; an ejection fraction of less than 30%; revascularization within the past 6 months; and coronary anatomy unsuitable for PCI.
OTHER TRIALS SUPPORT COURAGE FINDINGS
Although COURAGE was hailed as a landmark trial, it largely supported the results of previous studies. A meta-analysis of PCI vs optimal medical therapy published in 2005 found no significant differences in death, cardiac death, MI, or nonfatal MI.7 MI was actually slightly more common in the PCI group, reflecting the increased risk of MI in the periprocedural period.
Nor has the evidence from COURAGE discouraged additional studies of the same topic. Despite consistent findings that fit with our understanding of coronary disease as inflammation, we continue to conduct studies aimed at addressing significant stenosis, as if that were the problem. Thus, there have been studies of angioplasty alone, followed by studies of bare-metal stents and then drug-eluting stents.
In 2009, Trikalinos et al published a review of 61 randomized controlled trials comprising more than 25,000 patients with stable coronary disease and comparing medical therapy and angioplasty in its various forms over the previous 20 years.8 In all direct and indirect comparisons of PCI and medical therapy, there were no improvements in rates of death or MI.
Even so, the studies continue. The most recent “improvement” was the addition of fractional flow reserve, which served as the inclusion criterion for the Fractional Flow Reserve versus Angiography for Multivessel Evaluation 2 (FAME 2) trial.9 In that study, patients with at least 1 stenosis with a fractional flow reserve less than 0.80 were randomized to PCI plus medical therapy or to medical therapy alone. The primary end point was a composite of death from any cause, MI, and urgent revascularization. Unfortunately, the study was stopped early when the primary end point was met, a result driven entirely by a reduction in the need for urgent revascularization. There was no reduction in the rate of MI (hazard ratio 1.05, 95% confidence interval 0.51–2.19).
The reduction in urgent revascularization has also been shown consistently in past studies, but this is the weakest outcome measure because it does not equate to a reduction in the rate of MI. There is no demonstrable harm to putting off stent placement, even in functionally significant arteries, and most patients do not require a stent, even in the future.
In summary, the primary benefit of getting a stent now is a reduced likelihood of needing one later.
PCI MAY IMPROVE ANGINA FASTER
Another important finding of the COURAGE trial was that PCI improved symptoms more than optimal medical therapy.10 This is not surprising, because angina is often a direct result of a significant stenosis. What was unexpected was that even after PCI, most patients were not symptom-free. At 1 month, significantly more PCI patients were angina-free (42%) than were medical patients (33%). This translates into an absolute risk reduction of 9%, or a number needed to treat of 11 for 1 additional patient to become angina-free.
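The number needed to treat follows directly from the absolute difference in angina-free rates between the two groups at 1 month:

```latex
\text{ARR} = 0.42 - 0.33 = 0.09, \qquad
\text{NNT} = \frac{1}{\text{ARR}} = \frac{1}{0.09} \approx 11
```

In other words, about 11 patients would need to undergo PCI for 1 additional patient to be free of angina at 1 month, a benefit that, as noted below, narrows further over time.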
Patients in both groups improved over time, and after 3 years, the difference between the 2 groups was no longer significant: 59% in the PCI group vs 56% in the medical therapy group were angina-free.
A more recent study has raised the possibility that the improvement in angina with PCI is primarily a placebo effect. Researchers in the United Kingdom randomized patients with stable angina and at least a 70% stenosis of one vessel to either PCI or sham PCI, in which they threaded the catheter but did not deploy the stent.11 All patients received aggressive antianginal therapy before the procedure. At 6 weeks, there was improvement in angina in both groups, but no statistically significant difference between them in either exercise time or angina. Approximately half the patients in each group improved by at least 1 grade on the Canadian Cardiovascular Society angina classification, and more than 20% improved 2 grades.
This finding is not without precedent. Ligation of the internal mammary arteries, a popular treatment for angina in the 1950s, often provided dramatic relief of symptoms, until it was proven to be no better than a sham operation.12,13 More recently, a placebo-controlled trial of percutaneous laser myocardial revascularization also failed to show improvement over a sham treatment, despite promising results from a phase 1 trial.14 Together, these studies emphasize the subjective nature of angina as an outcome and call into question the routine use of PCI to relieve it.
PCI ENTAILS RISK
PCI entails a small but not inconsequential risk. During the procedure, 2% of patients develop bleeding or blood vessel damage, and another 1% die or have an MI or a stroke. In the first year after stent placement, 3% of patients have a bleeding event from the antiplatelet therapy needed for the stent, and an additional 2% develop a clot in the stent that leads to MI.15
INFORMED CONSENT IS CRITICAL
As demonstrated above, for patients with stable angina, the only evidence-based benefit of PCI over optimal medical therapy is that symptoms may respond faster. At the same time, there are costs and risks associated with the procedure. Because symptoms are subjective, patients should play a key role in deciding whether PCI is appropriate for them.
The American Medical Association states that a physician providing any treatment or procedure should disclose and discuss with patients the risks and benefits. Unfortunately, a substantial body of evidence demonstrates that this is not occurring in practice.
Patients and cardiologists have conflicting beliefs about PCI
Studies over the past 20 years demonstrate that patients with chronic stable angina consistently overestimate the benefits of PCI, with 71% to 88% believing that it will reduce their chance of death.16–19 Patients also understand that PCI can relieve their symptoms, though no study seems to have assessed the perceived magnitude of this benefit.
In contrast, when cardiologists were asked about the benefits their patients could expect from PCI, only 20% said that it would reduce mortality and 25% said it would prevent MI.18 These are still surprisingly high percentages, since the study was conducted after the COURAGE trial.
Nevertheless, these differences in perception show that cardiologists fail to successfully communicate the benefits of the procedure to their patients. Without complete information, patients cannot make informed decisions.
Cardiologists’ reasons for performing PCI
If PCI cannot improve hard outcomes like MI or death in stable coronary disease, why do cardiologists continue to perform it so frequently?
Soon after the COURAGE trial, Lin et al conducted focus groups with cardiologists to find out.20 Some said that they doubted the clinical trial evidence, given the reduction in the cardiac mortality rate over the past 30 years. Others remarked that their overriding goal is to stamp out ischemia, and that once a lesion is found by catheterization, one must proceed with PCI. This has been termed the “oculostenotic reflex,” ie, the interventionist sees coronary artery disease and immediately places a stent.
Atreya et al found objective evidence of this practice.21 In a 2016 study of 207 patients with obstructive lesions amenable to PCI, the only factors associated with medical management were those that increased the risk of the procedure: age, chronic kidney disease, distal location of the lesion, and type C lesions (the most difficult ones to treat by PCI). More important, evidence of ischemia, presence of angina, and being on optimal medical therapy or maximal antianginal therapy were not associated with PCI.
When surveyed, cardiologists offered reasons similar to those identified by Lin et al, including a positive stress test (70%) and significant myocardium at risk (50%).18 Failure of optimal medical therapy was cited less often (40%). More than 30% cited relief of chest pain in patients who had not been prescribed optimal medical therapy. Another 30% said that patient anxiety contributed to their decision, although patients who reported anxiety were no more likely to undergo PCI than those who did not.
True informed consent rarely occurs
Surveys of patients and recordings of doctor visits suggest that doctors often discuss the risks of the procedure but rarely accurately describe the benefits or mention alternative treatments, including optimal medical therapy.
Fowler et al22 surveyed 472 Medicare patients who had undergone PCI in the past year about their consent discussion, particularly regarding alternative options. Only 6% of patients recalled discussing medication as a serious option with their doctor.
In 2 published studies,23,24 we analyzed recorded conversations between doctors and patients in which angiography and PCI were discussed.
In a qualitative assessment of how cardiologists presented the rationale for PCI to patients,23 we observed that cardiologists gave an accurate presentation of the benefits in only 5% of cases. In 13% of the conversations the benefits were explicitly overstated (eg, “If you don’t do it [angiogram/PCI], what could happen? Well, you could…have a heart attack involving that area which can lead to a sudden death”). In another 35% of cases, physicians offered an implicit overstatement of the benefit by saying they could “fix” the problem (eg, “So that’s where we start thinking, Well maybe we better try to fix that [blockage]”), without specifically stating that fixing the problem would offer any benefit. Patients were left to fill in the blanks. Conversations frequently focused on the rationale for performing PCI (eg, ischemia on a stress test) and a description of the procedure, rather than on the risks and benefits.
In a quantitative study of the same data set, we assessed how often physicians addressed the 7 elements of informed decision-making as defined by Braddock et al.24
- Explaining the patient’s role in decision-making (ie, that the patient has a choice to make) was present in only half of the conversations. In the rest, a doctor would simply say, “The next step is to perform catheterization.”
- Discussion of clinical issues (eg, having a blockage, stress test results) was performed in almost every case, demonstrating physicians’ comfort with that element.
- Discussing treatment alternatives occurred in only 1 in 4 conversations. This was more frequent than previously reported, and appeared most often when patients expressed hesitancy about proceeding to PCI.
- Discussing pros and cons of the alternatives was done in 42%.
- Uncertainty about the procedure (eg, that it might not relieve the angina) was expressed in only 10% of conversations.
- Assessment of patient understanding was done in 65% of cases. This included even minimal efforts (eg, “Do you have any questions?”). More advanced methods such as teach-back were never used.
- Exploration of patient preferences (eg, asking patients which treatment they prefer, or attempting to understand how angina affects a patient’s life), the final element, occurred in 73% of conversations.
Only 3% of the conversations contained all 7 elements. Even using a more relaxed definition of 3 critical elements (ie, discussing clinical issues, treatment alternatives, and pros and cons), only 13% of conversations included them all.
Discussion affects decisions
Informed decision-making is not only important because of its ethical implications. Offering patients more information was associated with their choosing not to have PCI. The probability of a patient undergoing PCI was negatively associated with 3 specific elements of informed decision-making. Patients were less likely to choose PCI if the patient’s role in decision-making was discussed (61% vs 86% chose PCI, P < .03); if alternatives were discussed (31% vs 89% chose PCI, P < .01); and if uncertainties were discussed (17% vs 80% chose PCI, P < .01).
There was also a linear relationship between the total number of elements discussed and the probability of choosing PCI: it ranged from 100% of patients choosing PCI when just 1 element was present to 3% of patients choosing PCI when all 7 elements were present. The relationship is not entirely causal, since doctors were more likely to talk about alternatives and risks if patients hesitated and raised questions. Cautious patients received more information.
From these observational studies, we know that physicians do not generally communicate the benefits of PCI, and patients make incorrect assumptions about the benefits they can expect. We know that those who receive more information are less likely to choose PCI, but what would happen if patients were randomly assigned to receive complete information?
An online survey
We conducted an online survey of more than 1,000 participants over age 50 who had never undergone PCI, asking them to imagine visiting a cardiologist after having a positive stress test for stable chest pain.25 Three intervention groups read different scenarios couched as information provided by their cardiologist:
- The “standard care” group received no specific information about the effects of PCI on the risk of myocardial infarction
- The “specific information” group was specifically told that PCI does not reduce the risk of myocardial infarction
- The “explanatory information” group was told how medications work and why PCI does not reduce the risk of myocardial infarction.
All 3 groups received information about the risks of PCI, its role in reducing angina, and the risks and benefits of optimal medical therapy.
After reading their scenario, all participants completed an identical questionnaire, which asked if they would opt for PCI, medical therapy, or both. Overall, 55% chose PCI, ranging from 70% in the standard care group to 46% in the group given explanatory information. Rates in the specific-information and explanatory-information groups were not statistically different from each other, but both were significantly different from that in the standard-care group. Interestingly, the more information patients were given about PCI, the more likely they were to choose optimal medical therapy.
After reading the scenario, participants were also asked if PCI would “prevent a heart attack.” Of those who received standard care, 71% endorsed that belief, which is remarkably similar to studies of real patients who have received standard care. In contrast, only 39% of those given specific information and 31% given explanatory information held that belief. Moreover, the belief that PCI prevented MI was the strongest predictor of choosing PCI (odds ratio 5.82, 95% confidence interval 4.13–8.26).25
Interestingly, 52% of the standard care group falsely remembered that the doctor had told them that PCI would prevent an MI, even though the doctor said nothing about it one way or the other. It appears that participants were projecting their own beliefs onto the encounter. This highlights the importance of providing full information to patients who are considering this procedure.
TOWARD SHARED DECISION-MAKING
Shared decision-making is a process in which physicians enter into a partnership with a patient, offer information, elicit the patient’s preferences, and then come to a decision in concert with the patient.
Although many decisions can and should involve elements of shared decision-making, the decision to proceed with PCI for stable angina is particularly well suited to it, because the benefit of PCI depends on the value a patient attaches to being free of angina sooner. Since there is no difference in the risk of MI or death, the patient must decide whether the risks of the procedure and the inconvenience of taking dual antiplatelet therapy are worth the benefit of improving symptoms faster. Presumably, patients who have more severe symptoms or who have experienced side effects from antianginal therapy would be more likely to choose PCI.
Despite having substantial experience educating patients, most physicians are unfamiliar with the process of shared decision-making. In particular, the process of eliciting preferences is often overlooked.
To address this issue, researchers at the Mayo Clinic developed a decision aid that compares PCI plus optimal medical therapy vs optimal medical therapy alone in an easily understandable information card.15 On one side, the 2 options are clearly stated, with the magnitude of symptom improvement over time graphically illustrated and the statement, “NO DIFFERENCE in heart attack or death,” prominently displayed. The back of the card discusses the risks of each option in easily understood tables.
The decision aid was compared with standard care in a randomized trial involving patients who were referred for catheterization and possible PCI.26 The decision aid improved patients’ overall knowledge about PCI. In particular, 60% of those who used the decision aid knew that PCI did not prevent death or MI vs 40% of usual-care patients—results similar to those of the online experiment.
Interestingly, the decision about whether to undergo PCI did not differ significantly between the 2 groups, although there was a trend toward more patients in the decision-aid group choosing medical therapy alone (53%) vs the standard-care patients (39%).
To understand why the decision aid did not make more of a difference, the investigators performed qualitative interviews of the cardiologists in the study.27 One theme was the timing of the intervention. Patients using the decision aid had already been referred for catheterization, and some felt the process should have occurred earlier. Engaging in shared decision-making with a general cardiologist before referral could help to improve the quality of patient decisions.
Cardiologists also noted the difficulty in changing their work flow to incorporate the decision aid. Although some embraced the idea of shared decision-making, others were concerned that many patients could not participate, and there was confusion about the difference between an educational tool, which could be used by a patient alone, and a decision aid, which is meant to generate discussion between the doctor and patient. Some expressed interest in using the tool in the future.
These findings serve to emphasize that providing information alone is not enough. If the physician does not “buy in” to the idea of shared decision-making, it will not occur.
PRACTICE IMPLICATIONS
Based on the pathophysiology of coronary artery disease and the results of multiple randomized controlled trials, it is evident that PCI does not prevent heart attacks in patients with chronic stable angina. However, most patients who undergo PCI are unaware of this and therefore do not truly give informed consent. In the absence of explicit information to the contrary, most patients with stable angina assume that PCI prevents MI and thus are biased toward choosing PCI.
Even minimal amounts of explicit information can partially overcome that bias and influence decision-making. In particular, explaining why PCI does not prevent MI was the most effective means of overcoming the bias.
To this end, decision aids may help physicians engage in shared decision-making, which is most likely to occur if physicians are trained in the concept, are committed to practicing it, and can fit it into their work flow. Ideally, this would occur in the office of a general cardiologist before referral for PCI.
For those practicing in accountable-care organizations, Medicare has recently introduced the shared decision-making model for 6 preference-sensitive conditions, including stable ischemic heart disease. Participants in this program will have the opportunity to receive payments for shared decision-making services and to share in any savings that result from reduced use of resources. Use of these tools holds the promise for providing more patient-centered care at lower cost.
- Jones DS. Visions of a cure. Visualization, clinical trials, and controversies in cardiac therapeutics, 1968–1998. Isis 2000; 91:504–541.
- Hansson G. Inflammation, atherosclerosis, and coronary artery disease. N Engl J Med 2005; 352:1685–1695.
- Stone GW, Maehara A, Lansky AJ, et al. A prospective natural-history study of coronary atherosclerosis. N Engl J Med 2011; 364:226–235.
- Boden WE, O’Rourke RA, Teo KK, et al. Optimal medical therapy with or without PCI for stable coronary disease. N Engl J Med 2007; 356:1503–1516.
- Sedlis SP, Hartigan PM, Teo KK, et al. Effect of PCI on long-term survival in patients with stable ischemic heart disease. N Engl J Med 2015; 373:1937–1946.
- Lin GA, Dudley RA, Lucas FL, Malenka DJ, Vittinghoff E, Redberg RF. Frequency of stress testing to document ischemia prior to elective percutaneous coronary intervention. JAMA 2008; 300:1765–1773.
- Katritsis DG, Ioannidis JP. Percutaneous coronary intervention versus conservative therapy in nonacute coronary artery disease: a meta-analysis. Circulation 2005; 111:2906–2912.
- Trikalinos TA, Alsheikh-Ali AA, Tatsioni A, Nallamothu BK, Kent DM. Percutaneous coronary interventions for non-acute coronary artery disease: a quantitative 20-year synopsis and a network meta-analysis. Lancet 2009; 373:911–918.
- De Bruyne B, Pijls NHJ, Kalesan B, et al. Fractional flow reserve–guided PCI versus medical therapy in stable coronary disease. N Engl J Med 2012; 367:991–1001.
- Weintraub WS, Spertus JA, Kolm P, et al. Effect of PCI on quality of life in patients with stable coronary disease. N Engl J Med 2008; 359:677–687.
- Al-Lamee R, Thompson D, Dehbi H-M, et al, on behalf of the ORBITA Investigators. Percutaneous coronary intervention in stable angina (ORBITA): a double-blind, randomised controlled trial. Lancet. Published online November 2, 2017. http://dx.doi.org/10.1016/S0140-6736(17)32714-9. Accessed November 10, 2017.
- Cobb LA, Thomas GI, Dillard DH, et al. An evaluation of internal mammary-artery ligation by a double-blind technic. N Engl J Med 1959; 260:1115–1118.
- Dimond EG, Fittle F, Crockett JE. Comparison of internal mammary artery ligation and sham operation for angina pectoris. Am J Cardiol 1960; 5:483–486.
- Leon MB, Kornowski R, Downey WE, et al. A blinded, randomized placebo-controlled trial of percutaneous laser myocardial revascularization to improve angina symptoms in patients with severe coronary disease. J Am Coll Cardiol 2005; 46:1812–1819.
- Coylewright M, Shepel K, Leblanc A, et al. Shared decision making in patients with stable coronary artery disease: PCI choice. PLoS One 2012; 7:e49827.
- Holmboe ES, Fiellin DA, Cusanelli E, Remetz M, Krumholz HM. Perceptions of benefit and risk of patients undergoing first-time elective percutaneous coronary revascularization. J Gen Intern Med 2000; 15:632–637.
- Kee F, McDonald P, Gaffney B. Risks and benefits of coronary angioplasty: the patients perspective: a preliminary study. Qual Health Care 1997; 6:131–139.
- Rothberg MB, Sivalingam SK, Ashraf J, et al. Patients’ and cardiologists’ perceptions of the benefits of percutaneous coronary intervention for stable coronary disease. Ann Intern Med 2010; 153:307–313.
- Whittle J, Conigliaro J, Good CB, Kelley ME, Skanderson M. Understanding of the benefits of coronary revascularization procedures among patients who are offered such procedures. Am Heart J 2007; 154:662–668.
- Lin GA, Dudley RA, Redberg RF. Cardiologists’ use of percutaneous coronary interventions for stable coronary artery disease. Arch Intern Med 2007; 167:1604–1609.
- Atreya AR, Sivalingam SK, Arora S, et al. Predictors of medical management in patients undergoing elective cardiac catheterization for chronic ischemic heart disease. Clin Cardiol 2016; 39:207–214.
- Fowler FJ Jr, Gallagher PM, Bynum JP, Barry MJ, Lucas FL, Skinner JS. Decision-making process reported by Medicare patients who had coronary artery stenting or surgery for prostate cancer. J Gen Intern Med 2012; 27:911–916.
- Goff SL, Mazor KM, Ting HH, Kleppel R, Rothberg MB. How cardiologists present the benefits of percutaneous coronary interventions to patients with stable angina: a qualitative analysis. JAMA Intern Med 2014; 174:1614–1621.
- Braddock CH 3rd, Edwards KA, Hasenberg NM, Laidley TL, Levinson W. Informed decision making in outpatient practice: time to get back to basics. JAMA 1999; 282:2313–2320.
- Rothberg MB, Scherer L, Kashef MA, et al. The effect of information presentation on beliefs about the benefits of elective percutaneous coronary intervention. JAMA Intern Med 2014; 174:1623–1629.
- Coylewright M, Dick S, Zmolek B, et al. PCI choice decision aid for stable coronary artery disease: a randomized trial. Circ Cardiovasc Qual Outcomes 2016; 9:767–776.
- Coylewright M, O’Neill ES, Dick S, Grande SW. PCI choice: cardiovascular clinicians’ perceptions of shared decision making in stable coronary artery disease. Patient Educ Couns 2017; 100:1136–1143.
Multiple randomized controlled trials have compared percutaneous coronary intervention (PCI) vs optimal medical therapy for patients with chronic stable angina. All have consistently shown that PCI does not reduce the risk of death or even myocardial infarction (MI) but that it may relieve angina temporarily. Nevertheless, PCI is still commonly performed for patients with stable coronary disease, often in the absence of angina, and patients mistakenly believe the procedure is life-saving. Cardiologists may not be aware of patients’ misperceptions, or worse, may encourage them. In either case, if patients do not understand the benefits of the procedure, they cannot give informed consent.
This article reviews the pathophysiology of coronary artery disease, evidence from clinical trials of the value of PCI for chronic stable angina, patient and physician perceptions of PCI, and ways to promote patient-centered, shared decision-making.
CLINICAL CASE: EXERTIONAL ANGINA
While climbing 4 flights of stairs, a 55-year-old man noticed tightness in his chest, which lasted for 5 minutes and resolved spontaneously. Several weeks later, when visiting his primary care physician, he mentioned the episode. He had had no symptoms in the interim, but the physician ordered an exercise stress test.
Six minutes into a standard Bruce protocol, the patient experienced the same chest tightness, accompanied by 1-mm ST-segment depressions in leads II, III, and aVF. He was then referred to a cardiologist, who recommended catheterization.
Catheterization demonstrated a 95% stenosis of the right coronary artery with nonsignificant stenoses of the left anterior descending and circumflex arteries. A drug-eluting stent was placed in the right coronary artery, with no residual stenosis.
Did this intervention likely prevent an MI and perhaps save the man’s life?
HOW MYOCARDIAL INFARCTION HAPPENS
Understanding the pathogenesis of MI is critical to having realistic expectations of the benefits of stent placement.
Doctors often describe coronary atherosclerosis as a plumbing problem, where deposits of cholesterol and fat build up in arterial walls, clogging the pipes and eventually causing a heart attack. This analogy, which has been around since the 1950s, is easy for patients to grasp and has been popularized in the press and internalized by the public—as one patient with a 95% stenosis put it, “I was 95% dead.” In that model, angioplasty and stenting can resolve the blockage and “fix” the problem, much as a plumber can clear your pipes with a Roto-Rooter.
Despite the visual appeal of this model,1 it doesn’t accurately convey what we know about the pathophysiology of coronary artery disease. Instead of a gradual buildup of fatty deposits, low-density lipoprotein cholesterol particles infiltrate arterial walls and trigger an inflammatory reaction as they are engulfed by macrophages, leading to a cascade of cytokines and recruitment of more inflammatory cells.2 This immune response can eventually cause the rupture of the plaque’s fibrous cap, triggering thrombosis and infarction, often at a site of insignificant stenosis.
In this new model, coronary artery disease is primarily a problem of inflammation distributed throughout the vasculature, rather than a mechanical problem localized to the site of a significant stenosis.
Significant stenosis does not equal unstable plaque
Not all plaques are equally likely to rupture. Stable plaques tend to be long-standing and calcified, with a thick fibrous cap. A stable plaque causing a 95% stenosis may cause symptoms with exertion, but it is unlikely to cause infarction.3 Conversely, a rupture-prone plaque may cause little stenosis, yet a large and dangerous plaque may be lurking beneath its thin fibrous cap.
Relying on angiography can be misleading. Treating all significant stenoses improves blood flow, but does not reduce the risk of infarction, because infarction most often occurs in areas where the lumen is not obstructed. A plaque causing only 30% stenosis can suddenly rupture, causing thrombosis and complete occlusion.
The current model explains why PCI is no better than optimal medical therapy (ie, risk factor modification, antiplatelet therapy with aspirin, and a statin). Diet, exercise, smoking cessation, and statins target inflammatory processes and lower low-density lipoprotein cholesterol levels, while aspirin prevents platelet aggregation, among other likely actions.
The model also explains why coronary artery bypass grafting reduces the risk of MI and death in patients with left main or 3-vessel disease. A patient with generalized coronary artery disease has multiple lesions, many of which do not cause significant stenoses. PCI corrects only a single stenosis, whereas coronary artery bypass grafting circumvents all the vulnerable plaques in a vessel.
THE LANDMARK COURAGE TRIAL
Published in 2007, the Clinical Outcomes Utilizing Revascularization and Aggressive Drug Evaluation (COURAGE) trial4 randomized more than 2,000 patients to receive either optimal medical therapy plus PCI or optimal medical therapy alone. The primary outcome was a composite of death from any cause and nonfatal MI. Patients were followed for at least 3 years, and some for as long as 7 years.
There was an initial small upward spike in the primary outcome in the PCI arm due to periprocedural events. By 5 years, the outcomes of the 2 arms converged and then stayed the same for up to 15 years.5 The authors concluded that PCI conferred no benefit over optimal medical therapy in the risk of death or MI.
Some doctors dismiss the study because of its stringent entry criteria—of 35,539 patients assessed, only 3,071 met the eligibility criteria. However, the entry criteria were meant to identify patients most likely to benefit from PCI. Many patients who undergo PCI today would not have qualified for the study because they lack objective evidence of ischemia.6 To enroll, patients needed a proximal stenosis of at least 70% and objective evidence of ischemia or a coronary stenosis of more than 80% and classic angina. Exclusion criteria disqualified few patients: Canadian Cardiovascular Society class IV angina (ie, angina evoked from minimal activity or at rest); a markedly positive stress test (substantial ST-segment depression or hypotension during stage I of the Bruce protocol); refractory heart failure or cardiogenic shock; an ejection fraction of less than 30%; revascularization within the past 6 months; and coronary anatomy unsuitable for PCI.
OTHER TRIALS SUPPORT COURAGE FINDINGS
Although COURAGE was hailed as a landmark trial, it largely supported the results of previous studies. A meta-analysis of PCI vs optimal medical therapy published in 2005 found no significant differences in death, cardiac death, MI, or nonfatal MI.7 MI was actually slightly more common in the PCI group due to the increased risk of MI during the periprocedural period.
Nor has the evidence from COURAGE discouraged additional studies of the same topic. Despite consistent findings that fit with our understanding of coronary disease as inflammation, we continue to conduct studies aimed at addressing significant stenosis, as if that were the problem. Thus, there have been studies of angioplasty alone, followed by studies of bare-metal stents and then drug-eluting stents.
In 2009, Trikalinos et al published a review of 61 randomized controlled trials comprising more than 25,000 patients with stable coronary disease and comparing medical therapy and angioplasty in its various forms over the previous 20 years.8 In all direct and indirect comparisons of PCI and medical therapy, there were no improvements in rates of death or MI.
Even so, the studies continue. The most recent “improvement” was the addition of fractional flow reserve, which served as the inclusion criterion for the Fractional Flow Reserve versus Angiography for Multivessel Evaluation 2 (FAME 2) trial.9 In that study, patients with at least 1 stenosis with a fractional flow reserve less than 0.80 were randomized to PCI plus medical therapy or to medical therapy alone. The primary end point was a composite of death from any cause, MI, and urgent revascularization. Unfortunately, the study was stopped early when the primary end point was met due to a reduction in the need for urgent revascularization. There was no reduction in the rate of MI (hazard ratio 1.05, 95% confidence interval 0.51–2.19).
The reduction in urgent revascularization has also been shown consistently in past studies, but this is the weakest outcome measure because it does not equate to a reduction in the rate of MI. There is no demonstrable harm to putting off stent placement, even in functionally significant arteries, and most patients do not require a stent, even in the future.
In summary, the primary benefit of getting a stent now is a reduced likelihood of needing one later.
PCI MAY IMPROVE ANGINA FASTER
Another important finding of the COURAGE trial was that PCI improved symptoms more than optimal medical therapy.10 This is not surprising, because angina is often a direct result of a significant stenosis. What was unexpected was that even after PCI, most patients were not symptom-free. At 1 month, significantly more PCI patients were angina-free (42%) than were medical patients (33%). This translates into an absolute risk reduction of 9% or a number needed to treat of 11 to prevent 1 case of angina.
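The arithmetic behind these figures can be checked directly. A minimal sketch, using the 42% and 33% angina-free rates quoted from the trial above:

```python
# Worked arithmetic for the 1-month COURAGE symptom data quoted above:
# 42% angina-free with PCI vs 33% with medical therapy alone.
pci_angina_free = 0.42
med_angina_free = 0.33

# Absolute risk reduction (here, the absolute increase in the
# proportion of angina-free patients)
arr = pci_angina_free - med_angina_free   # 0.09, ie, 9 percentage points

# Number needed to treat: how many patients must undergo PCI for
# 1 additional patient to be angina-free at 1 month
nnt = 1 / arr                             # about 11

print(f"ARR = {arr:.0%}, NNT = {nnt:.0f}")
```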
Patients in both groups improved over time, and after 3 years, the difference between the 2 groups was no longer significant: 59% in the PCI group vs 56% in the medical therapy group were angina-free.
A more recent study has raised the possibility that the improvement in angina with PCI is primarily a placebo effect. Researchers in the United Kingdom randomized patients with stable angina and at least a 70% stenosis of one vessel to either PCI or sham PCI, in which they threaded the catheter but did not deploy the stent.11 All patients received aggressive antianginal therapy before the procedure. At 6 weeks, there was improvement in angina in both groups, but no statistically significant difference between them in either exercise time or angina. Approximately half the patients in each group improved by at least 1 grade on the Canadian Cardiovascular Society angina classification, and more than 20% improved 2 grades.
This finding is not without precedent. Ligation of the internal mammary arteries, a popular treatment for angina in the 1950s, often provided dramatic relief of symptoms, until it was proven to be no better than a sham operation.12,13 More recently, a placebo-controlled trial of percutaneous laser myocardial revascularization also failed to show improvement over a sham treatment, despite promising results from a phase 1 trial.14 Together, these studies emphasize the subjective nature of angina as an outcome and call into question the routine use of PCI to relieve it.
PCI ENTAILS RISK
PCI entails a small but not inconsequential risk. During the procedure, 2% of patients develop bleeding or blood vessel damage, and another 1% die or have an MI or a stroke. In the first year after stent placement, 3% of patients have a bleeding event from the antiplatelet therapy needed for the stent, and an additional 2% develop a clot in the stent that leads to MI.15
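For a rough sense of the overall hazard, the individual risks quoted above can be combined. This sketch assumes the events are independent and the percentages exact — simplifications not made in the source:

```python
# Approximate first-year risks quoted above (assumed independent):
# 2% bleeding or blood vessel damage during the procedure,
# 1% periprocedural death, MI, or stroke,
# 3% bleeding on antiplatelet therapy,
# 2% stent thrombosis leading to MI.
risks = [0.02, 0.01, 0.03, 0.02]

# Probability of escaping every event is the product of (1 - risk)
p_none = 1.0
for p in risks:
    p_none *= (1 - p)

p_any = 1 - p_none   # roughly 7.8%, slightly less than the naive 8% sum
print(f"Chance of at least one adverse event: {p_any:.1%}")
```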
INFORMED CONSENT IS CRITICAL
As demonstrated above, for patients with stable angina, the only evidence-based benefit of PCI over optimal medical therapy is that symptoms may respond faster. At the same time, there are costs and risks associated with the procedure. Because symptoms are subjective, patients should play a key role in deciding whether PCI is appropriate for them.
The American Medical Association states that a physician providing any treatment or procedure should disclose and discuss its risks and benefits with the patient. Unfortunately, a substantial body of evidence demonstrates that this is not occurring in practice.
Patients and cardiologists have conflicting beliefs about PCI
Studies over the past 20 years demonstrate that patients with chronic stable angina consistently overestimate the benefits of PCI, with 71% to 88% believing that it will reduce their chance of death.16–19 Patients also understand that PCI can relieve their symptoms, though no study seems to have assessed the perceived magnitude of this benefit.
In contrast, when cardiologists were asked about the benefits their patients could expect from PCI, only 20% said that it would reduce mortality and 25% said it would prevent MI.18 These are still surprisingly high percentages, since the study was conducted after the COURAGE trial.
Nevertheless, these differences in perception show that cardiologists fail to successfully communicate the benefits of the procedure to their patients. Without complete information, patients cannot make informed decisions.
Cardiologists’ reasons for performing PCI
If PCI cannot improve hard outcomes like MI or death in stable coronary disease, why do cardiologists continue to perform it so frequently?
Soon after the COURAGE trial, Lin et al conducted focus groups with cardiologists to find out.20 Some said that they doubted the clinical trial evidence, given the reduction in the cardiac mortality rate over the past 30 years. Others remarked that their overriding goal is to stamp out ischemia, and that once a lesion is found by catheterization, one must proceed with PCI. This has been termed the “oculostenotic reflex,” ie, the interventionist sees coronary artery disease and immediately places a stent.
Atreya et al found objective evidence of this practice.21 In a 2016 study of 207 patients with obstructive lesions amenable to PCI, the only factors associated with medical management were those that increased the risk of the procedure: age, chronic kidney disease, distal location of the lesion, and type C lesions (the most difficult ones to treat by PCI). More important, evidence of ischemia, presence of angina, and being on optimal medical therapy or maximal antianginal therapy were not associated with PCI.
When surveyed, cardiologists offered reasons similar to those identified by Lin et al, including a positive stress test (70%) and significant myocardium at risk (50%).18 Failure of optimal medical therapy was cited less often (40%). More than 30% cited relief of chest pain in patients who had not been prescribed optimal medical therapy. Another 30% said that patient anxiety contributed to their decision, but patients who reported anxiety were no more likely to get PCI than those who did not.
True informed consent rarely occurs
Surveys of patients and recordings of doctor visits suggest that doctors often discuss the risks of the procedure but rarely accurately describe the benefits or mention alternative treatments, including optimal medical therapy.
Fowler et al22 surveyed 472 Medicare patients who had undergone PCI in the past year about their consent discussion, particularly regarding alternative options. Only 6% of patients recalled discussing medication as a serious option with their doctor.
In 2 published studies,23,24 we analyzed recorded conversations between doctors and patients in which angiography and PCI were discussed.
In a qualitative assessment of how cardiologists presented the rationale for PCI to patients,23 we observed that cardiologists gave an accurate presentation of the benefits in only 5% of cases. In 13% of the conversations the benefits were explicitly overstated (eg, “If you don’t do it [angiogram/PCI], what could happen? Well, you could…have a heart attack involving that area which can lead to a sudden death”). In another 35% of cases, physicians offered an implicit overstatement of the benefit by saying they could “fix” the problem (eg, “So that’s where we start thinking, Well maybe we better try to fix that [blockage]”), without specifically stating that fixing the problem would offer any benefit. Patients were left to fill in the blanks. Conversations frequently focused on the rationale for performing PCI (eg, ischemia on a stress test) and a description of the procedure, rather than on the risks and benefits.
In a quantitative study of the same data set, we assessed how often physicians addressed the 7 elements of informed decision-making as defined by Braddock et al.24
- Explaining the patient’s role in decision-making (ie, that the patient has a choice to make) was present in half of the conversations. Sometimes a doctor would simply say, “The next step is to perform catheterization.”
- Discussion of clinical issues (eg, having a blockage, stress test results) was performed in almost every case, demonstrating physicians’ comfort with that element.
- Discussing treatment alternatives occurred in only 1 in 4 conversations. This was more frequent than previously reported, and appeared most often when patients expressed hesitancy about proceeding to PCI.
- Discussing pros and cons of the alternatives was done in 42%.
- Uncertainty about the procedure (eg, that it might not relieve the angina) was expressed in only 10% of conversations.
- Assessment of patient understanding was done in 65% of cases. This included even minimal efforts (eg, “Do you have any questions?”). More advanced methods such as teach-back were never used.
- Exploration of patient preferences (eg, asking patients which treatment they prefer, or attempting to understand how angina affects a patient’s life), the final element, occurred in 73% of conversations.
Only 3% of the conversations contained all 7 elements. Even using a more relaxed definition of 3 critical elements (ie, discussing clinical issues, treatment alternatives, and pros and cons), only 13% of conversations included them all.
Discussion affects decisions
Informed decision-making is not only important because of its ethical implications. Offering patients more information was associated with their choosing not to have PCI. The probability of a patient undergoing PCI was negatively associated with 3 specific elements of informed decision-making. Patients were less likely to choose PCI if the patient’s role in decision-making was discussed (61% vs 86% chose PCI, P < .03); if alternatives were discussed (31% vs 89% chose PCI, P < .01); and if uncertainties were discussed (17% vs 80% chose PCI, P < .01).
There was also a linear relationship between the total number of elements discussed and the probability of choosing PCI: it ranged from 100% of patients choosing PCI when just 1 element was present to 3% of patients choosing PCI when all 7 elements were present. The relationship is not entirely causal, since doctors were more likely to talk about alternatives and risks if patients hesitated and raised questions. Cautious patients received more information.
From these observational studies, we know that physicians do not generally communicate the benefits of PCI, and patients make incorrect assumptions about the benefits they can expect. We know that those who receive more information are less likely to choose PCI, but what would happen if patients were randomly assigned to receive complete information?
An online survey
We conducted an online survey of more than 1,000 participants over age 50 who had never undergone PCI, asking them to imagine visiting a cardiologist after having a positive stress test for stable chest pain.25 Three intervention groups read different scenarios couched as information provided by their cardiologist:
- The “standard care” group received no specific information about the effects of PCI on the risk of myocardial infarction
- The “specific information” group was specifically told that PCI does not reduce the risk of myocardial infarction
- The “explanatory information” group was told how medications work and why PCI does not reduce the risk of myocardial infarction.
All 3 groups received information about the risks of PCI, its role in reducing angina, and the risks and benefits of optimal medical therapy.
After reading their scenario, all participants completed an identical questionnaire, which asked if they would opt for PCI, medical therapy, or both. Overall, 55% chose PCI, ranging from 70% in the standard care group to 46% in the group given explanatory information. Rates in the specific-information and explanatory-information groups were not statistically different from each other, but both were significantly different from that in the standard-care group. Interestingly, the more information patients were given about PCI, the more likely they were to choose optimal medical therapy.
After reading the scenario, participants were also asked if PCI would “prevent a heart attack.” Of those who received standard care, 71% endorsed that belief, which is remarkably similar to studies of real patients who have received standard care. In contrast, only 39% of those given specific information and 31% given explanatory information held that belief. Moreover, the belief that PCI prevented MI was the strongest predictor of choosing PCI (odds ratio 5.82, 95% confidence interval 4.13–8.26).25
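To illustrate how an odds ratio like the one reported here is derived, the sketch below computes an OR and a Wald 95% confidence interval from a 2 × 2 table. The counts are invented for illustration and are not the study’s data:

```python
import math

# Hypothetical 2x2 table (NOT the study's data), chosen so the odds
# ratio lands near the reported 5.82:
#                            chose PCI   chose medical therapy
# believed PCI prevents MI:    a = 300        b = 150
# did not hold that belief:    c = 100        d = 290
a, b, c, d = 300, 150, 100, 290

# Odds ratio: (odds of choosing PCI among believers) /
#             (odds of choosing PCI among non-believers)
odds_ratio = (a * d) / (b * c)

# Wald 95% confidence interval, computed on the log-odds scale
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```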
Interestingly, 52% of the standard care group falsely remembered that the doctor had told them that PCI would prevent an MI, even though the doctor said nothing about it one way or the other. It appears that participants were projecting their own beliefs onto the encounter. This highlights the importance of providing full information to patients who are considering this procedure.
TOWARD SHARED DECISION-MAKING
Shared decision-making is a process in which physicians enter into a partnership with a patient, offer information, elicit the patient’s preferences, and then come to a decision in concert with the patient.
Although many decisions can and should involve elements of shared decision-making, the decision to proceed with PCI for stable angina is particularly well-suited to shared decision-making. This is because the benefit of PCI depends on the value a patient attaches to being free of angina sooner. Since there is no difference in the risk of MI or death, the patient must decide if the risks of the procedure and the inconvenience of taking dual antiplatelet therapy are worth the benefit of improving symptoms faster. Presumably, patients who have more severe symptoms or experienced side effects from antianginal therapy would be more likely to choose PCI.
Despite having substantial experience educating patients, most physicians are unfamiliar with the process of shared decision-making. In particular, the process of eliciting preferences is often overlooked.
To address this issue, researchers at the Mayo Clinic developed a decision aid that compares PCI plus optimal medical therapy vs optimal medical therapy alone in an easily understandable information card.15 On one side, the 2 options are clearly stated, with the magnitude of symptom improvement over time graphically illustrated and the statement, “NO DIFFERENCE in heart attack or death,” prominently displayed. The back of the card discusses the risks of each option in easily understood tables.
The decision aid was compared with standard care in a randomized trial involving patients who were referred for catheterization and possible PCI.26 The decision aid improved patients’ overall knowledge about PCI. In particular, 60% of those who used the decision aid knew that PCI did not prevent death or MI vs 40% of usual-care patients—results similar to those of the online experiment.
Interestingly, the decision about whether to undergo PCI did not differ significantly between the 2 groups, although there was a trend toward more patients in the decision-aid group choosing medical therapy alone (53%) vs the standard-care patients (39%).
To understand why the decision aid did not make more of a difference, the investigators performed qualitative interviews of the cardiologists in the study.27 One theme was the timing of the intervention. Patients using the decision aid had already been referred for catheterization, and some felt the process should have occurred earlier. Engaging in shared decision-making with a general cardiologist before referral could help to improve the quality of patient decisions.
Cardiologists also noted the difficulty in changing their work flow to incorporate the decision aid. Although some embraced the idea of shared decision-making, others were concerned that many patients could not participate, and there was confusion about the difference between an educational tool, which could be used by a patient alone, and a decision aid, which is meant to generate discussion between the doctor and patient. Some expressed interest in using the tool in the future.
These findings serve to emphasize that providing information alone is not enough. If the physician does not “buy in” to the idea of shared decision-making, it will not occur.
PRACTICE IMPLICATIONS
Based on the pathophysiology of coronary artery disease and the results of multiple randomized controlled trials, it is evident that PCI does not prevent heart attacks in patients with chronic stable angina. However, most patients who undergo PCI are unaware of this and therefore do not truly give informed consent. In the absence of explicit information to the contrary, most patients with stable angina assume that PCI prevents MI and thus are biased toward choosing PCI.
Even minimal amounts of explicit information can partially overcome that bias and influence decision-making. In particular, explaining why PCI does not prevent MI was the most effective means of overcoming the bias.
To this end, shared decision aids may help physicians to engage in shared decision-making. Shared decision-making is most likely to occur if physicians are trained in the concept of shared decision-making, are committed to practicing it, and can fit it into their work flow. Ideally, this would occur in the office of a general cardiologist before referral for PCI.
For those practicing in accountable-care organizations, Medicare has recently introduced the shared decision-making model for 6 preference-sensitive conditions, including stable ischemic heart disease. Participants in this program will have the opportunity to receive payments for shared decision-making services and to share in any savings that result from reduced use of resources. Use of these tools holds the promise for providing more patient-centered care at lower cost.
Multiple randomized controlled trials have compared percutaneous coronary intervention (PCI) vs optimal medical therapy for patients with chronic stable angina. All have consistently shown that PCI does not reduce the risk of death or even myocardial infarction (MI) but that it may relieve angina temporarily. Nevertheless, PCI is still commonly performed for patients with stable coronary disease, often in the absence of angina, and patients mistakenly believe the procedure is life-saving. Cardiologists may not be aware of patients’ misperceptions, or worse, may encourage them. In either case, if patients do not understand the benefits of the procedure, they cannot give informed consent.
This article reviews the pathophysiology of coronary artery disease, evidence from clinical trials of the value of PCI for chronic stable angina, patient and physician perceptions of PCI, and ways to promote patient-centered, shared decision-making.
CLINICAL CASE: EXERTIONAL ANGINA
While climbing 4 flights of stairs, a 55-year-old man noticed tightness in his chest, which lasted for 5 minutes and resolved spontaneously. Several weeks later, when visiting his primary care physician, he mentioned the episode. He had had no symptoms in the interim, but the physician ordered an exercise stress test.
Six minutes into a standard Bruce protocol, the patient experienced the same chest tightness, accompanied by 1-mm ST-segment depressions in leads II, III, and aVF. He was then referred to a cardiologist, who recommended catheterization.
Catheterization demonstrated a 95% stenosis of the right coronary artery with nonsignificant stenoses of the left anterior descending and circumflex arteries. A drug-eluting stent was placed in the right coronary artery, with no residual stenosis.
Did this intervention likely prevent an MI and perhaps save the man’s life?
HOW MYOCARDIAL INFARCTION HAPPENS
Understanding the pathogenesis of MI is critical to having realistic expectations of the benefits of stent placement.
Doctors often describe coronary atherosclerosis as a plumbing problem, where deposits of cholesterol and fat build up in arterial walls, clogging the pipes and eventually causing a heart attack. This analogy, which has been around since the 1950s, is easy for patients to grasp and has been popularized in the press and internalized by the public—as one patient with a 95% stenosis put it, “I was 95% dead.” In that model, angioplasty and stenting can resolve the blockage and “fix” the problem, much as a plumber can clear your pipes with a Roto-Rooter.
Despite the visual appeal of this model,1 it doesn’t accurately convey what we know about the pathophysiology of coronary artery disease. Instead of a gradual buildup of fatty deposits, low-density lipoprotein cholesterol particles infiltrate arterial walls and trigger an inflammatory reaction as they are engulfed by macrophages, leading to a cascade of cytokines and recruitment of more inflammatory cells.2 This immune response can eventually cause the rupture of the plaque’s fibrous cap, triggering thrombosis and infarction, often at a site of insignificant stenosis.
In this new model, coronary artery disease is primarily a problem of inflammation distributed throughout the vasculature, rather than a mechanical problem localized to the site of a significant stenosis.
Significant stenosis does not equal unstable plaque
Not all plaques are equally likely to rupture. Stable plaques tend to be long-standing and calcified, with a thick fibrous cap. A stable plaque causing a 95% stenosis may cause symptoms with exertion, but it is unlikely to cause infarction.3 Conversely, rupture-prone plaques may cause little stenosis, but a large and dangerous plaque may be lurking beneath the thin fibrous cap.
Relying on angiography can be misleading. Treating all significant stenoses improves blood flow, but does not reduce the risk of infarction, because infarction most often occurs in areas where the lumen is not obstructed. A plaque causing only 30% stenosis can suddenly rupture, causing thrombosis and complete occlusion.
The current model explains why PCI is no better than optimal medical therapy (ie, risk factor modification, antiplatelet therapy with aspirin, and a statin). Diet, exercise, smoking cessation, and statins target inflammatory processes and lower low-density lipoprotein cholesterol levels, while aspirin prevents platelet aggregation, among other likely actions.
The model also explains why coronary artery bypass grafting reduces the risk of MI and death in patients with left main or 3-vessel disease. A patient with generalized coronary artery disease has multiple lesions, many of which do not cause significant stenoses. PCI corrects only a single stenosis, whereas coronary artery bypass grafting circumvents all the vulnerable plaques in a vessel.
THE LANDMARK COURAGE TRIAL
Published in 2007, the Clinical Outcomes Utilizing Revascularization and Aggressive Drug Evaluation (COURAGE) trial4 randomized more than 2,000 patients to receive either optimal medical therapy plus PCI or optimal medical therapy alone. The primary outcome was a composite of death from any cause and nonfatal MI. Patients were followed for at least 3 years, and some for as long as 7 years.
There was an initial small upward spike in the primary outcome in the PCI arm due to periprocedural events. By 5 years, the outcomes of the 2 arms converged and then stayed the same for up to 15 years.5 The authors concluded that PCI conferred no benefit over optimal medical therapy in the risk of death or MI.
Some doctors dismiss the study because of its stringent entry criteria—of 35,539 patients assessed, only 3,071 met the eligibility criteria. However, the entry criteria were meant to identify patients most likely to benefit from PCI. Many patients who undergo PCI today would not have qualified for the study because they lack objective evidence of ischemia.6 To enroll, patients needed a proximal stenosis of at least 70% and objective evidence of ischemia or a coronary stenosis of more than 80% and classic angina. Exclusion criteria disqualified few patients: Canadian Cardiovascular Society class IV angina (ie, angina evoked from minimal activity or at rest); a markedly positive stress test (substantial ST-segment depression or hypotension during stage I of the Bruce protocol); refractory heart failure or cardiogenic shock; an ejection fraction of less than 30%; revascularization within the past 6 months; and coronary anatomy unsuitable for PCI.
OTHER TRIALS SUPPORT COURAGE FINDINGS
Although COURAGE was hailed as a landmark trial, it largely supported the results of previous studies. A meta-analysis of PCI vs optimal medical therapy published in 2005 found no significant differences in death, cardiac death, MI, or nonfatal MI.7 MI was actually slightly more common in the PCI group due to the increased risk of MI during the periprocedural period.
Nor has the evidence from COURAGE discouraged additional studies of the same topic. Despite consistent findings that fit with our understanding of coronary disease as inflammation, we continue to conduct studies aimed at addressing significant stenosis, as if that was the problem. Thus, there have been studies of angioplasty alone, followed by studies of bare-metal stents and then drug-eluting stents.
In 2009, Trikalinos et al published a review of 61 randomized controlled trials comprising more than 25,000 patients with stable coronary disease and comparing medical therapy and angioplasty in its various forms over the previous 20 years.8 In all direct and indirect comparisons of PCI and medical therapy, there were no improvements in rates of death or MI.
Even so, the studies continue. The most recent “improvement” was the addition of fractional flow reserve, which served as the inclusion criterion for the Fractional Flow Reserve versus Angiography for Multivessel Evaluation 2 (FAME 2) trial.9 In that study, patients with at least 1 stenosis with a fractional flow reserve less than 0.80 were randomized to PCI plus medical therapy or to medical therapy alone. The primary end point was a composite of death from any cause, MI, and urgent revascularization. Unfortunately, the study was stopped early when the primary end point was met, driven entirely by a reduction in the need for urgent revascularization. There was no reduction in the rate of MI (hazard ratio 1.05, 95% confidence interval 0.51–2.19).
The reduction in urgent revascularization has also been shown consistently in past studies, but it is the weakest outcome measure because it does not equate to a reduction in the rate of MI. There is no demonstrable harm in deferring stent placement, even in functionally significant arteries, and most patients never end up needing a stent.
In summary, the primary benefit of getting a stent now is a reduced likelihood of needing one later.
PCI MAY IMPROVE ANGINA FASTER
Another important finding of the COURAGE trial was that PCI improved symptoms more than optimal medical therapy.10 This is not surprising, because angina is often a direct result of a significant stenosis. What was unexpected was that even after PCI, most patients were not symptom-free. At 1 month, significantly more PCI patients were angina-free (42%) than were medical patients (33%). This translates into an absolute risk reduction of 9% or a number needed to treat of 11 to prevent 1 case of angina.
Patients in both groups improved over time, and after 3 years, the difference between the 2 groups was no longer significant: 59% in the PCI group vs 56% in the medical therapy group were angina-free.
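A minimal sketch of the arithmetic behind the 1-month figures above: the absolute risk reduction (ARR) and number needed to treat (NNT) follow directly from the reported angina-free rates (42% with PCI vs 33% with medical therapy alone).

```python
# ARR and NNT from the 1-month angina-free rates reported in COURAGE.
pci_angina_free = 0.42   # proportion angina-free at 1 month, PCI arm
med_angina_free = 0.33   # proportion angina-free at 1 month, medical-therapy arm

arr = pci_angina_free - med_angina_free   # absolute risk reduction, ~0.09 (9%)
nnt = 1 / arr                             # ~11.1: treat about 11 patients with PCI
                                          # for 1 additional patient to be angina-free
print(f"ARR = {arr:.0%}, NNT ≈ {nnt:.0f}")
```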
A more recent study has raised the possibility that the improvement in angina with PCI is primarily a placebo effect. Researchers in the United Kingdom randomized patients with stable angina and at least a 70% stenosis of one vessel to either PCI or sham PCI, in which they threaded the catheter but did not deploy the stent.11 All patients received aggressive antianginal therapy before the procedure. At 6 weeks, there was improvement in angina in both groups, but no statistically significant difference between them in either exercise time or angina. Approximately half the patients in each group improved by at least 1 grade on the Canadian Cardiovascular Society angina classification, and more than 20% improved 2 grades.
This finding is not without precedent. Ligation of the internal mammary arteries, a popular treatment for angina in the 1950s, often provided dramatic relief of symptoms, until it was proven to be no better than a sham operation.12,13 More recently, a placebo-controlled trial of percutaneous laser myocardial revascularization also failed to show improvement over a sham treatment, despite promising results from a phase 1 trial.14 Together, these studies emphasize the subjective nature of angina as an outcome and call into question the routine use of PCI to relieve it.
PCI ENTAILS RISK
PCI entails a small but not inconsequential risk. During the procedure, 2% of patients develop bleeding or blood vessel damage, and another 1% die or have an MI or a stroke. In the first year after stent placement, 3% of patients have a bleeding event from the antiplatelet therapy needed for the stent, and an additional 2% develop a clot in the stent that leads to MI.15
INFORMED CONSENT IS CRITICAL
As demonstrated above, for patients with stable angina, the only evidence-based benefit of PCI over optimal medical therapy is that symptoms may respond faster. At the same time, there are costs and risks associated with the procedure. Because symptoms are subjective, patients should play a key role in deciding whether PCI is appropriate for them.
The American Medical Association states that a physician providing any treatment or procedure should disclose and discuss with patients the risks and benefits. Unfortunately, a substantial body of evidence demonstrates that this is not occurring in practice.
Patients and cardiologists have conflicting beliefs about PCI
Studies over the past 20 years demonstrate that patients with chronic stable angina consistently overestimate the benefits of PCI, with 71% to 88% believing that it will reduce their chance of death.16–19 Patients also understand that PCI can relieve their symptoms, though no study seems to have assessed the perceived magnitude of this benefit.
In contrast, when cardiologists were asked about the benefits their patients could expect from PCI, only 20% said that it would reduce mortality and 25% said it would prevent MI.18 These are still surprisingly high percentages, since the study was conducted after the COURAGE trial.
Nevertheless, these differences in perception show that cardiologists fail to successfully communicate the benefits of the procedure to their patients. Without complete information, patients cannot make informed decisions.
Cardiologists’ reasons for performing PCI
If PCI cannot improve hard outcomes like MI or death in stable coronary disease, why do cardiologists continue to perform it so frequently?
Soon after the COURAGE trial, Lin et al conducted focus groups with cardiologists to find out.20 Some said that they doubted the clinical trial evidence, given the reduction in the cardiac mortality rate over the past 30 years. Others remarked that their overriding goal is to stamp out ischemia, and that once a lesion is found by catheterization, one must proceed with PCI. This has been termed the “oculostenotic reflex,” ie, the interventionist sees coronary artery disease and immediately places a stent.
Atreya et al found objective evidence of this practice.21 In a 2016 study of 207 patients with obstructive lesions amenable to PCI, the only factors associated with medical management were those that increased the risk of the procedure: age, chronic kidney disease, distal location of the lesion, and type C lesions (the most difficult ones to treat by PCI). More important, evidence of ischemia, presence of angina, and being on optimal medical therapy or maximal antianginal therapy were not associated with PCI.
When surveyed, cardiologists offered reasons similar to those identified by Lin et al, including a positive stress test (70%) and significant myocardium at risk (50%).18 Optimal medical therapy failure was cited less often (40%). Over 30% identified relief of chest pain for patients who were not prescribed optimal medical therapy. Another 30% said that patient anxiety contributed to their decision, but patients who reported anxiety were not more likely to get PCI than those who did not.
True informed consent rarely occurs
Surveys of patients and recordings of doctor visits suggest that doctors often discuss the risks of the procedure but rarely accurately describe the benefits or mention alternative treatments, including optimal medical therapy.
Fowler et al22 surveyed 472 Medicare patients who had undergone PCI in the past year about their consent discussion, particularly regarding alternative options. Only 6% of patients recalled discussing medication as a serious option with their doctor.
In 2 published studies,23,24 we analyzed recorded conversations between doctors and patients in which angiography and PCI were discussed.
In a qualitative assessment of how cardiologists presented the rationale for PCI to patients,23 we observed that cardiologists gave an accurate presentation of the benefits in only 5% of cases. In 13% of the conversations the benefits were explicitly overstated (eg, “If you don’t do it [angiogram/PCI], what could happen? Well, you could…have a heart attack involving that area which can lead to a sudden death”). In another 35% of cases, physicians offered an implicit overstatement of the benefit by saying they could “fix” the problem (eg, “So that’s where we start thinking, Well maybe we better try to fix that [blockage]”), without specifically stating that fixing the problem would offer any benefit. Patients were left to fill in the blanks. Conversations frequently focused on the rationale for performing PCI (eg, ischemia on a stress test) and a description of the procedure, rather than on the risks and benefits.
In a quantitative study of the same data set, we assessed how often physicians addressed the 7 elements of informed decision-making as defined by Braddock et al.24
- Explaining the patient’s role in decision-making (ie, that the patient has a choice to make) was present in half of the conversations. Sometimes a doctor would simply say, “The next step is to perform catheterization.”
- Discussion of clinical issues (eg, having a blockage, stress test results) was performed in almost every case, demonstrating physicians’ comfort with that element.
- Discussing treatment alternatives occurred in only 1 in 4 conversations. This was more frequent than previously reported, and appeared most often when patients expressed hesitancy about proceeding to PCI.
- Discussing pros and cons of the alternatives was done in 42%.
- Uncertainty about the procedure (eg, that it might not relieve the angina) was expressed in only 10% of conversations.
- Assessment of patient understanding was done in 65% of cases. This included even minimal efforts (eg, “Do you have any questions?”). More advanced methods such as teach-back were never used.
- Exploration of patient preferences (eg, asking patients which treatment they prefer, or attempting to understand how angina affects a patient’s life), the final element, occurred in 73% of conversations.
Only 3% of the conversations contained all 7 elements. Even using a more relaxed definition of 3 critical elements (ie, discussing clinical issues, treatment alternatives, and pros and cons), only 13% of conversations included them all.
Discussion affects decisions
Informed decision-making is important not only for its ethical implications. Offering patients more information was associated with their choosing not to have PCI. The probability of a patient undergoing PCI was negatively associated with 3 specific elements of informed decision-making. Patients were less likely to choose PCI if the patient’s role in decision-making was discussed (61% vs 86% chose PCI, P < .03); if alternatives were discussed (31% vs 89% chose PCI, P < .01); and if uncertainties were discussed (17% vs 80% chose PCI, P < .01).
There was also an inverse linear relationship between the total number of elements discussed and the probability of choosing PCI: 100% of patients chose PCI when just 1 element was present vs 3% when all 7 elements were present. The relationship is not entirely causal, since doctors were more likely to talk about alternatives and risks if patients hesitated and raised questions. Cautious patients received more information.
From these observational studies, we know that physicians do not generally communicate the benefits of PCI, and patients make incorrect assumptions about the benefits they can expect. We know that those who receive more information are less likely to choose PCI, but what would happen if patients were randomly assigned to receive complete information?
An online survey
We conducted an online survey of more than 1,000 participants over age 50 who had never undergone PCI, asking them to imagine visiting a cardiologist after having a positive stress test for stable chest pain.25 Participants were randomized to 1 of 3 groups, each reading a different scenario couched as information provided by their cardiologist:
- The “standard care” group received no specific information about the effects of PCI on the risk of myocardial infarction
- The “specific information” group was specifically told that PCI does not reduce the risk of myocardial infarction
- The “explanatory information” group was told how medications work and why PCI does not reduce the risk of myocardial infarction.
All 3 groups received information about the risks of PCI, its role in reducing angina, and the risks and benefits of optimal medical therapy.
After reading their scenario, all participants completed an identical questionnaire, which asked if they would opt for PCI, medical therapy, or both. Overall, 55% chose PCI, ranging from 70% in the standard care group to 46% in the group given explanatory information. Rates in the specific-information and explanatory-information groups were not statistically different from each other, but both were significantly different from that in the standard-care group. Interestingly, the more information patients were given about PCI, the more likely they were to choose optimal medical therapy.
After reading the scenario, participants were also asked if PCI would “prevent a heart attack.” Of those who received standard care, 71% endorsed that belief, which is remarkably similar to studies of real patients who have received standard care. In contrast, only 39% of those given specific information and 31% given explanatory information held that belief. Moreover, the belief that PCI prevented MI was the strongest predictor of choosing PCI (odds ratio 5.82, 95% confidence interval 4.13–8.26).25
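The odds ratio reported above comes from a 2 × 2 table of belief (PCI prevents MI) vs choice (opting for PCI). As a hedged sketch of that calculation, using hypothetical counts rather than the study’s actual data, the odds ratio and its Wald 95% confidence interval can be computed as follows:

```python
# Sketch: odds ratio and Wald 95% CI from a 2x2 table of belief vs choice.
# The counts passed in below are hypothetical illustrations, not study data.
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a/b: chose PCI vs did not, among believers; c/d: same, among non-believers."""
    or_ = (a * d) / (b * c)                     # cross-product odds ratio
    se_log_or = sqrt(1/a + 1/b + 1/c + 1/d)     # standard error of log(OR)
    lo = exp(log(or_) - z * se_log_or)
    hi = exp(log(or_) + z * se_log_or)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(300, 120, 150, 350)  # hypothetical counts
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```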
Interestingly, 52% of the standard care group falsely remembered that the doctor had told them that PCI would prevent an MI, even though the doctor said nothing about it one way or the other. It appears that participants were projecting their own beliefs onto the encounter. This highlights the importance of providing full information to patients who are considering this procedure.
TOWARD SHARED DECISION-MAKING
Shared decision-making is a process in which physicians enter into a partnership with a patient, offer information, elicit the patient’s preferences, and then come to a decision in concert with the patient.
Although many decisions can and should involve elements of shared decision-making, the decision to proceed with PCI for stable angina is particularly well-suited to shared decision-making. This is because the benefit of PCI depends on the value a patient attaches to being free of angina sooner. Since there is no difference in the risk of MI or death, the patient must decide if the risks of the procedure and the inconvenience of taking dual antiplatelet therapy are worth the benefit of improving symptoms faster. Presumably, patients who have more severe symptoms or experienced side effects from antianginal therapy would be more likely to choose PCI.
Despite having substantial experience educating patients, most physicians are unfamiliar with the process of shared decision-making. In particular, the process of eliciting preferences is often overlooked.
To address this issue, researchers at the Mayo Clinic developed a decision aid that compares PCI plus optimal medical therapy vs optimal medical therapy alone in an easily understandable information card.15 On one side, the 2 options are clearly stated, with the magnitude of symptom improvement over time graphically illustrated and the statement, “NO DIFFERENCE in heart attack or death,” prominently displayed. The back of the card discusses the risks of each option in easily understood tables.
The decision aid was compared with standard care in a randomized trial involving patients who were referred for catheterization and possible PCI.26 The decision aid improved patients’ overall knowledge about PCI. In particular, 60% of those who used the decision aid knew that PCI did not prevent death or MI vs 40% of usual-care patients—results similar to those of the online experiment.
Interestingly, the decision about whether to undergo PCI did not differ significantly between the 2 groups, although there was a trend toward more patients in the decision-aid group choosing medical therapy alone (53%) vs the standard-care patients (39%).
To understand why the decision aid did not make more of a difference, the investigators performed qualitative interviews of the cardiologists in the study.27 One theme was the timing of the intervention. Patients using the decision aid had already been referred for catheterization, and some felt the process should have occurred earlier. Engaging in shared decision-making with a general cardiologist before referral could help to improve the quality of patient decisions.
Cardiologists also noted the difficulty in changing their work flow to incorporate the decision aid. Although some embraced the idea of shared decision-making, others were concerned that many patients could not participate, and there was confusion about the difference between an educational tool, which could be used by a patient alone, and a decision aid, which is meant to generate discussion between the doctor and patient. Some expressed interest in using the tool in the future.
These findings serve to emphasize that providing information alone is not enough. If the physician does not “buy in” to the idea of shared decision-making, it will not occur.
PRACTICE IMPLICATIONS
Based on the pathophysiology of coronary artery disease and the results of multiple randomized controlled trials, it is evident that PCI does not prevent heart attacks in patients with chronic stable angina. However, most patients who undergo PCI are unaware of this and therefore do not truly give informed consent. In the absence of explicit information to the contrary, most patients with stable angina assume that PCI prevents MI and thus are biased toward choosing PCI.
Even minimal amounts of explicit information can partially overcome that bias and influence decision-making. In particular, explaining why PCI does not prevent MI was the most effective means of overcoming the bias.
To this end, shared decision aids may help physicians to engage in shared decision-making. Shared decision-making is most likely to occur if physicians are trained in the concept of shared decision-making, are committed to practicing it, and can fit it into their work flow. Ideally, this would occur in the office of a general cardiologist before referral for PCI.
For those practicing in accountable-care organizations, Medicare has recently introduced the shared decision-making model for 6 preference-sensitive conditions, including stable ischemic heart disease. Participants in this program will have the opportunity to receive payments for shared decision-making services and to share in any savings that result from reduced use of resources. Use of these tools holds the promise for providing more patient-centered care at lower cost.
- Jones DS. Visions of a cure. Visualization, clinical trials, and controversies in cardiac therapeutics, 1968–1998. Isis 2000; 91:504–541.
- Hansson G. Inflammation, atherosclerosis, and coronary artery disease. N Engl J Med 2005; 352:1685–1695.
- Stone GW, Maehara A, Lansky AJ, et al. A prospective natural-history study of coronary atherosclerosis. N Engl J Med 2011; 364:226–235.
- Boden WE, O’Rourke RA, Teo KK, et al. Optimal medical therapy with or without PCI for stable coronary disease. N Engl J Med 2007; 356:1503–1516.
- Sedlis SP, Hartigan PM, Teo KK, et al. Effect of PCI on long-term survival in patients with stable ischemic heart disease. N Engl J Med 2015; 373:1937–1946.
- Lin GA, Dudley RA, Lucas FL, Malenka DJ, Vittinghoff E, Redberg RF. Frequency of stress testing to document ischemia prior to elective percutaneous coronary intervention. JAMA 2008; 300:1765–1773.
- Katritsis DG, Ioannidis JP. Percutaneous coronary intervention versus conservative therapy in nonacute coronary artery disease: a meta-analysis. Circulation 2005; 111:2906–2912.
- Trikalinos TA, Alsheikh-Ali AA, Tatsioni A, Nallamothu BK, Kent DM. Percutaneous coronary interventions for non-acute coronary artery disease: a quantitative 20-year synopsis and a network meta-analysis. Lancet 2009; 373:911–918.
- De Bruyne B, Pijls NHJ, Kalesan B, et al. Fractional flow reserve–guided PCI versus medical therapy in stable coronary disease. N Engl J Med 2012; 367:991–1001.
- Weintraub WS, Spertus JA, Kolm P, et al. Effect of PCI on quality of life in patients with stable coronary disease. N Engl J Med 2008; 359:677–687.
- Al-Lamee R, Thompson D, Dehbi H-M, et al, on behalf of the ORBITA Investigators. Percutaneous coronary intervention in stable angina (ORBITA): a double-blind, randomised controlled trial. Lancet. Published online November 2, 2017. http://dx.doi.org/10.1016/S0140-6736(17)32714-9. Accessed November 10, 2017.
- Cobb LA, Thomas GI, Dillard DH, et al. An evaluation of internal mammary-artery ligation by a double-blind technic. N Engl J Med 1959; 260:1115–1118.
- Dimond EG, Kittle CF, Crockett JE. Comparison of internal mammary artery ligation and sham operation for angina pectoris. Am J Cardiol 1960; 5:483–486.
- Leon MB, Kornowski R, Downey WE, et al. A blinded, randomized placebo-controlled trial of percutaneous laser myocardial revascularization to improve angina symptoms in patients with severe coronary disease. J Am Coll Cardiol 2005; 46:1812–1819.
- Coylewright M, Shepel K, Leblanc A, et al. Shared decision making in patients with stable coronary artery disease: PCI choice. PLoS One 2012; 7:e49827.
- Holmboe ES, Fiellin DA, Cusanelli E, Remetz M, Krumholz HM. Perceptions of benefit and risk of patients undergoing first-time elective percutaneous coronary revascularization. J Gen Intern Med 2000; 15:632–637.
- Kee F, McDonald P, Gaffney B. Risks and benefits of coronary angioplasty: the patients perspective: a preliminary study. Qual Health Care 1997; 6:131–139.
- Rothberg MB, Sivalingam SK, Ashraf J, et al. Patients’ and cardiologists’ perceptions of the benefits of percutaneous coronary intervention for stable coronary disease. Ann Intern Med 2010; 153:307–313.
- Whittle J, Conigliaro J, Good CB, Kelley ME, Skanderson M. Understanding of the benefits of coronary revascularization procedures among patients who are offered such procedures. Am Heart J 2007; 154:662–668.
- Lin GA, Dudley RA, Redberg RF. Cardiologists’ use of percutaneous coronary interventions for stable coronary artery disease. Arch Intern Med 2007; 167:1604–1609.
- Atreya AR, Sivalingam SK, Arora S, et al. Predictors of medical management in patients undergoing elective cardiac catheterization for chronic ischemic heart disease. Clin Cardiol 2016; 39:207–214.
- Fowler FJ Jr, Gallagher PM, Bynum JP, Barry MJ, Lucas FL, Skinner JS. Decision-making process reported by Medicare patients who had coronary artery stenting or surgery for prostate cancer. J Gen Intern Med 2012; 27:911–916.
- Goff SL, Mazor KM, Ting HH, Kleppel R, Rothberg MB. How cardiologists present the benefits of percutaneous coronary interventions to patients with stable angina: a qualitative analysis. JAMA Intern Med 2014; 174:1614–1621.
- Braddock CH 3rd, Edwards KA, Hasenberg NM, Laidley TL, Levinson W. Informed decision making in outpatient practice: time to get back to basics. JAMA 1999; 282:2313–2320.
- Rothberg MB, Scherer L, Kashef MA, et al. The effect of information presentation on beliefs about the benefits of elective percutaneous coronary intervention. JAMA Intern Med 2014; 174:1623–1629.
- Coylewright M, Dick S, Zmolek B, et al. PCI choice decision aid for stable coronary artery disease: a randomized trial. Circ Cardiovasc Qual Outcomes 2016; 9:767–776.
- Coylewright M, O’Neill ES, Dick S, Grande SW. PCI choice: cardiovascular clinicians’ perceptions of shared decision making in stable coronary artery disease. Patient Educ Couns 2017; 100:1136–1143.
- Jones DS. Visions of a cure. Visualization, clinical trials, and controversies in cardiac therapeutics, 1968–1998. Isis 2000; 91:504–541.
- Hansson G. Inflammation, atherosclerosis, and coronary artery disease. N Engl J Med 2005; 352:1685–1695.
- Stone GW, Maehara A, Lansky AJ, et al. A prospective natural-history study of coronary atherosclerosis. N Engl J Med 2011; 364:226–235.
- Boden WE, O’Rourke RA, Teo KK, et al. Optimal medical therapy with or without PCI for stable coronary disease. N Engl J Med 2007; 356:1503–1516.
- Sedlis SP, Hartigan PM, Teo KK, et al. Effect of PCI on long-term survival in patients with stable ischemic heart disease. N Engl J Med 2015; 373:1937–1946.
- Lin GA, Dudley RA, Lucas FL, Malenka DJ, Vittinghoff E, Redberg RF. Frequency of stress testing to document ischemia prior to elective percutaneous coronary intervention. JAMA 2008; 300:1765–1773.
- Katritsis DG, Ioannidis JP. Percutaneous coronary intervention versus conservative therapy in nonacute coronary artery disease: a meta-analysis. Circulation 2005; 111:2906–2912.
- Trikalinos TA, Alsheikh-Ali AA, Tatsioni A, Nallamothu BK, Kent DM. Percutaneous coronary interventions for non-acute coronary artery disease: a quantitative 20-year synopsis and a network meta-analysis. Lancet 2009; 373:911–918.
- De Bruyne B, Pijls NHJ, Kalesan B, et al. Fractional flow reserve–guided PCI versus medical therapy in stable coronary disease. N Engl J Med 2012; 367:991–1001.
- Weintraub WS, Spertus JA, Kolm P, et al. Effect of PCI on quality of life in patients with stable coronary disease. N Engl J Med 2008; 359:677–687.
- Al-Lamee R, Thompson D, Dehbi H-M, et al, on behalf of the ORBITA Investigators. Percutaneous coronary intervention in stable angina (ORBITA): a double-blind, randomised controlled trial. Lancet. Published online November 2, 2017. http://dx.doi.org/10.1016/S0140-6736(17)32714-9. Accessed November 10, 2017.
- Cobb LA, Thomas GI, Dillard DH, et al. An evaluation of internal mammary-artery ligation by a double-blind technic. N Engl J Med 1959; 260:1115–1118.
- Dimond EG, Fittle F, Crockett JE. Comparison of internal mammary artery ligation and sham operation for angina pectoris. Am J Cardiol 1960; 5:483-486.
- Leon MB, Kornowski R, Downey WE, et al. A blinded, randomized placebo-controlled trial of percutaneous laser myocardial revascularization to improve angina symptoms in patients with severe coronary disease. J Am Coll Cardiol 2005; 46:1812–1819.
- Coylewright M, Shepel K, Leblanc A, et al. Shared decision making in patients with stable coronary artery disease: PCI choice. PLoS One 2012; 7:e49827.
- Holmboe ES, Fiellin DA, Cusanelli E, Remetz M, Krumholz HM. Perceptions of benefit and risk of patients undergoing first-time elective percutaneous coronary revascularization. J Gen Intern Med 2000; 15:632–637.
- Kee F, McDonald P, Gaffney B. Risks and benefits of coronary angioplasty: the patients perspective: a preliminary study. Qual Health Care 1997; 6:131–139.
- Rothberg MB, Sivalingam SK, Ashraf J, et al. Patients’ and cardiologists’ perceptions of the benefits of percutaneous coronary intervention for stable coronary disease. Ann Intern Med 2010; 153:307–313.
- Whittle J, Conigliaro J, Good CB, Kelley ME, Skanderson M. Understanding of the benefits of coronary revascularization procedures among patients who are offered such procedures. Am Heart J 2007; 154:662–668.
- Lin GA, Dudley RA, Redberg RF. Cardiologists’ use of percutaneous coronary interventions for stable coronary artery disease. Arch Intern Med 2007; 167:1604–1609.
- Atreya AR, Sivalingam SK, Arora S, et al. Predictors of medical management in patients undergoing elective cardiac catheterization for chronic ischemic heart disease. Clin Cardiol 2016; 39:207–214.
- Fowler FJ Jr, Gallagher PM, Bynum JP, Barry MJ, Lucas FL, Skinner JS. Decision-making process reported by Medicare patients who had coronary artery stenting or surgery for prostate cancer. J Gen Intern Med 2012; 27:911–916.
- Goff SL, Mazor KM, Ting HH, Kleppel R, Rothberg MB. How cardiologists present the benefits of percutaneous coronary interventions to patients with stable angina: a qualitative analysis. JAMA Intern Med 2014; 174:1614–1621.
- Braddock CH 3rd, Edwards KA, Hasenberg NM, Laidley TL, Levinson W. Informed decision making in outpatient practice: time to get back to basics. JAMA 1999; 282:2313–2320.
- Rothberg MB, Scherer L, Kashef MA, et al. The effect of information presentation on beliefs about the benefits of elective percutaneous coronary intervention. JAMA Intern Med 2014; 174:1623–1629.
- Coylewright M, Dick S, Zmolek B, et al. PCI choice decision aid for stable coronary artery disease: a randomized trial. Circ Cardiovasc Qual Outcomes 2016; 9:767–776.
- Coylewright M, O’Neill ES, Dick S, Grande SW. PCI choice: cardiovascular clinicians’ perceptions of shared decision making in stable coronary artery disease. Patient Educ Couns 2017; 100:1136–1143.
KEY POINTS
- For patients with stable angina pectoris, PCI does not prevent myocardial infarction or death.
- Optimal medical therapy with aspirin and a statin can reduce the risk of myocardial infarction and should be recommended for all patients with stable angina, regardless of whether they undergo PCI.
- PCI improves symptoms of angina faster than medical therapy alone, but more than half of patients will be free of angina in about 2 years with either option.
- In the absence of information to the contrary, most patients and some doctors assume that PCI is life-saving and are biased towards choosing it. As a result, patients are rarely able to give true informed consent to undergo PCI.
Navigating the anticoagulant landscape in 2017
This article reviews recommendations and evidence concerning current anticoagulant management for venous thromboembolism and perioperative care, with an emphasis on individualizing treatment for real-world patients.
TREATING ACUTE VENOUS THROMBOEMBOLISM
Case 1: Deep vein thrombosis in an otherwise healthy man
A 40-year-old man presents with 7 days of progressive right leg swelling. He has no antecedent risk factors for deep vein thrombosis or other medical problems. Venous ultrasonography reveals an iliofemoral deep vein thrombosis. How should he be managed?
- Outpatient treatment with low-molecular-weight heparin for 4 to 6 days plus warfarin
- Outpatient treatment with a direct oral anticoagulant, ie, apixaban, dabigatran (which requires 4 to 6 days of initial treatment with low-molecular-weight heparin), or rivaroxaban
- Catheter-directed thrombolysis followed by low-molecular-weight heparin, then warfarin or a direct oral anticoagulant
- Inpatient intravenous heparin for 7 to 10 days, then warfarin or a direct oral anticoagulant
All of these are acceptable for managing acute venous thromboembolism, but the clinician’s role is to identify which treatment is most appropriate for an individual patient.
Deep vein thrombosis is not a single condition
Multiple guidelines exist to help decide on a management strategy. Those of the American College of Chest Physicians (ACCP)1 are used most often.
That said, guidelines are established for “average” patients, so it is important to look beyond them and individualize management. Venous thromboembolism is not a single entity; it has myriad clinical presentations that may call for different treatments. Most patients have submassive deep vein thrombosis or pulmonary embolism, which is neither limb-threatening nor associated with hemodynamic instability. Etiology also varies: the event can be unprovoked (idiopathic), cancer-related, catheter-associated, or provoked by surgery or immobility.
Deep vein thrombosis has a wide spectrum of presentations. It can involve the veins of the calf only, or it can involve the femoral and iliac veins and other locations including the splanchnic veins, the cerebral sinuses, and upper extremities. Pulmonary embolism can be massive (defined as being associated with hemodynamic instability or impending respiratory failure) or submassive. Similarly, patients differ in terms of baseline medical conditions, mobility, and lifestyle. Anticoagulant management decisions should take all these factors into account.
Consider clot location
Our patient with iliofemoral deep vein thrombosis is best managed differently from a more typical patient with less extensive thrombosis involving the popliteal or femoral vein segments, or both. A clot involving the iliac vein is more likely to lead to chronic postthrombotic pain and swelling, because few venous outflow channels exist to bypass a clot at that location, creating higher venous pressure within the affected leg. Therefore, for our patient, catheter-directed thrombolysis is an option that should be considered.
Catheter-directed thrombolysis trials
According to the “open-vein hypothesis,” quickly eliminating the thrombus and restoring unobstructed venous flow may mitigate the risk not only of recurrent thrombosis, but also of postthrombotic syndrome, which is often not given much consideration acutely but can cause significant, life-altering chronic disability.
The “valve-integrity hypothesis” is also important: it posits that lytic therapy may prevent damage to the venous valves and thereby mitigate venous hypertension.
Thus, catheter-directed thrombolysis offers theoretical benefits, and recent trials have assessed it against standard anticoagulation treatments.
The CaVenT trial (Catheter-Directed Venous Thrombolysis),2 conducted in Norway, randomized 209 patients with midfemoral to iliac deep vein thrombosis to conventional treatment (anticoagulation alone) or anticoagulation plus catheter-directed thrombolysis. At 2 years, postthrombotic syndrome had occurred in 41% of the catheter-directed thrombolysis group compared with 56% of the conventional treatment group (P = .047). At 5 years, the difference widened to 43% vs 71% (P < .01, number needed to treat = 4).3 Despite the superiority of lytic therapy, the incidence of postthrombotic syndrome remained high in patients who received this treatment.
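The number needed to treat quoted above is simply the reciprocal of the absolute risk reduction; as a quick sanity check on the CaVenT figures, a minimal illustrative calculation (not from the article):

```python
import math

def number_needed_to_treat(risk_control: float, risk_treated: float) -> int:
    """NNT = 1 / absolute risk reduction, rounded up to a whole patient."""
    arr = risk_control - risk_treated
    if arr <= 0:
        raise ValueError("no absolute risk reduction")
    return math.ceil(1 / arr)

# CaVenT at 5 years: postthrombotic syndrome in 71% with anticoagulation
# alone vs 43% with added catheter-directed thrombolysis
print(number_needed_to_treat(0.71, 0.43))  # -> 4
```

With the 5-year rates, 1/(0.71 − 0.43) ≈ 3.6, which rounds up to the reported number needed to treat of 4.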
The ATTRACT trial (Acute Venous Thrombosis: Thrombus Removal With Adjunctive Catheter-Directed Thrombolysis),4 a US multicenter, open-label, assessor-blind study, randomized 698 patients with femoral or more-proximal deep vein thrombosis to either standard care (anticoagulant therapy and graduated elastic compression stockings) or standard care plus catheter-directed thrombolysis. In preliminary results presented at the Society of Interventional Radiology meeting in March 2017, although no difference was found in the primary outcome (postthrombotic syndrome at 24 months), catheter-directed thrombolysis for iliofemoral deep vein thrombosis led to a 25% reduction in moderate to severe postthrombotic syndrome.
Although it is too early to draw conclusions before publication of the ATTRACT study, the preliminary results highlight the need to individualize treatment and to be selective about using catheter-directed thrombolysis. The trials provide reassurance that catheter-directed lysis is a reasonable and safe intervention when performed by physicians experienced in the procedure. The risk of major bleeding appears to be low (about 2%) and that for intracranial hemorrhage even lower (< 0.5%).
Catheter-directed thrombolysis is appropriate in some cases
The 2016 ACCP guidelines1 recommend anticoagulant therapy alone over catheter-directed thrombolysis for patients with acute proximal deep vein thrombosis of the leg. However, it is a grade 2C (weak) recommendation.
They provide no specific recommendation as to the clinical indications for catheter-directed thrombolysis, but identify patients who would be most likely to benefit, ie, those who have:
- Iliofemoral deep vein thrombosis
- Symptoms for less than 14 days
- Good functional status
- Life expectancy of more than 1 year
- Low risk of bleeding.
Our patient satisfies these criteria, suggesting that catheter-directed thrombolysis is a reasonable option for him.
Timing is important. Catheter-directed lysis is more likely to be beneficial if used before fibrin deposits form and stiffen the venous valves, causing irreversible damage that leads to postthrombotic syndrome.
Role of direct oral anticoagulants
The availability of direct oral anticoagulants has generated interest in defining their therapeutic role in patients with venous thromboembolism.
In a meta-analysis5 of major trials comparing direct oral anticoagulants and vitamin K antagonists such as warfarin, no significant difference was found for the risk of recurrent venous thromboembolism or venous thromboembolism-related deaths. However, fewer patients experienced major bleeding with direct oral anticoagulants (relative risk 0.61, P = .002). Although significant, the absolute risk reduction was small; the incidence of major bleeding was 1.1% with direct oral anticoagulants vs 1.8% with vitamin K antagonists.
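The gap between the relative and absolute numbers quoted above is worth making explicit; a small illustrative calculation using the meta-analysis rates:

```python
def risk_contrast(rate_treated: float, rate_control: float):
    """Return (relative risk, absolute risk reduction) for two event rates."""
    return rate_treated / rate_control, rate_control - rate_treated

# Major bleeding: 1.1% with direct oral anticoagulants vs 1.8% with
# vitamin K antagonists
rr, arr = risk_contrast(0.011, 0.018)
print(f"relative risk {rr:.2f}, absolute reduction {arr:.3f}")
# a roughly 39% relative reduction is only 0.7 percentage points in
# absolute terms
```

This is why a statistically significant relative risk of 0.61 can still correspond to a small absolute benefit.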
The main advantage of direct oral anticoagulants is greater convenience for the patient.
WHICH PATIENTS ON WARFARIN NEED BRIDGING PREOPERATIVELY?
Many patients still take warfarin, particularly those with atrial fibrillation, a mechanical heart valve, or venous thromboembolism. In many countries, warfarin remains the dominant anticoagulant for stroke prevention. Whether these patients need heparin bridging during perioperative warfarin interruption is a frequently encountered question that, until recently, was controversial. Recent studies have helped clarify the need for bridging in many of these patients.
Case 2: An elderly woman on warfarin facing cancer surgery
A 75-year-old woman weighing 65 kg is scheduled for elective colon resection for incidentally found colon cancer. She is taking warfarin for atrial fibrillation. She also has hypertension and diabetes and had a transient ischemic attack 10 years ago.
One doctor told her she needs to be assessed for heparin bridging, but another told her she does not need bridging.
The default management should be not to bridge patients who have atrial fibrillation, but to consider bridging in selected patients, such as those with recent stroke or transient ischemic attack or a prior thromboembolic event during warfarin interruption. However, decisions about bridging should not be made on the basis of the CHADS2 score alone. For the patient described here, I would recommend not bridging.
Complex factors contribute to stroke risk
Stroke risk for patients with atrial fibrillation can be quickly estimated with the CHADS2 score, based on:
- Congestive heart failure (1 point)
- Hypertension (1 point)
- Age at least 75 (1 point)
- Diabetes (1 point)
- Stroke or transient ischemic attack (2 points).
Our patient has a score of 5, corresponding to an annual adjusted stroke risk of 12.5%. Whether her transient ischemic attack of 10 years ago is comparable in significance to a recent stroke is debatable and highlights a weakness of clinical prediction rules. Moreover, such prediction scores were developed to estimate the long-term risk of stroke if anticoagulants are not given, and they have not been assessed in a perioperative setting where there is short-term interruption of anticoagulants. Also, the perioperative milieu is associated with additional factors not captured in these clinical prediction rules that may affect the risk of stroke.
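The point-based rule above can be sketched as a small function (an illustrative sketch only, not a validated clinical calculator):

```python
def chads2_score(chf: bool, hypertension: bool, age_75_or_over: bool,
                 diabetes: bool, prior_stroke_or_tia: bool) -> int:
    """CHADS2: 1 point each for congestive heart failure, hypertension,
    age >= 75, and diabetes; 2 points for prior stroke or TIA."""
    return (int(chf) + int(hypertension) + int(age_75_or_over)
            + int(diabetes) + 2 * int(prior_stroke_or_tia))

# Case 2: age 75, hypertension, diabetes, and a remote transient
# ischemic attack
print(chads2_score(chf=False, hypertension=True, age_75_or_over=True,
                   diabetes=True, prior_stroke_or_tia=True))  # -> 5
```

As the article notes, the score captures long-term risk off anticoagulation, not the short-term perioperative risk.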
Thus, the risk of perioperative stroke likely involves the interplay of multiple factors, including the type of surgery the patient is undergoing. Some factors may be mitigated:
- Rebound hypercoagulability after stopping an oral anticoagulant can be prevented by intraoperative blood pressure and volume control
- Elevated biochemical factors (eg, D-dimer, B-type natriuretic peptide, troponin) may be lowered with perioperative aspirin therapy
- Lipid and genetic factors may be mitigated with perioperative statin use.
Can heparin bridging also mitigate the risk?
Bridging in patients with atrial fibrillation
Most patients who are taking warfarin are doing so because of atrial fibrillation, so most evidence about perioperative bridging was developed in such patients.
The BRIDGE trial (Bridging Anticoagulation in Patients Who Require Temporary Interruption of Warfarin Therapy for an Elective Invasive Procedure or Surgery)6 was the first randomized controlled trial to compare a bridging and a no-bridging strategy for patients with atrial fibrillation who required warfarin interruption for elective surgery. Nearly 2,000 patients were given either low-molecular-weight heparin or placebo from 3 days before until 24 hours before the procedure, and then for 5 to 10 days afterward. For all patients, warfarin was stopped 5 days before the procedure and resumed within 24 hours afterward.
A no-bridging strategy was noninferior to bridging: the risk of perioperative arterial thromboembolism was 0.4% without bridging vs 0.3% with bridging (P = .01 for noninferiority). In addition, a no-bridging strategy conferred a lower risk of major bleeding than bridging: 1.3% vs 3.2% (relative risk 0.41, P = .005 for superiority).
Although the difference in absolute bleeding risk was small, bleeding rates were lower than those seen outside of clinical trials because the bridging protocol used in BRIDGE was designed to minimize bleeding. Also, although only 5% of patients had a CHADS2 score of 5 or 6, such patients are also infrequent in clinical practice, and BRIDGE did include a considerable proportion (17%) of patients with a prior stroke or transient ischemic attack who would be considered at high risk.
Other evidence about heparin bridging is derived from observational studies, more than 10 of which have been conducted. In general, they have found that not bridging is associated with low rates of arterial thromboembolism (< 0.5%) and that bridging is associated with high rates of major bleeding (4%–7%).7–12
Bridging in patients with a mechanical heart valve
Warfarin is the only anticoagulant option for patients who have a mechanical heart valve. No randomized controlled trials have evaluated the benefits of perioperative bridging vs no bridging in this setting.
Observational (cohort) studies suggest that the risk of perioperative arterial thromboembolism is similar with or without bridging anticoagulation, although most patients studied were bridged and those not bridged were considered at low risk (eg, with a bileaflet aortic valve and no additional risk factors).13 However, without stronger evidence from randomized controlled trials, bridging should be the default management for patients with a mechanical heart valve. In our practice, we bridge most patients who have a mechanical heart valve unless they are considered to be at low risk, such as those who have a bileaflet aortic valve.
Bridging in patients with prior venous thromboembolism
Even less evidence is available for periprocedural management of patients who have a history of venous thromboembolism, and no randomized controlled trials have compared bridging with no bridging. In 1 cohort study in which more than 90% of patients had had thromboembolism more than 3 months before the procedure, the rate of recurrent venous thromboembolism without bridging was less than 0.5%.14
It is reasonable to bridge patients who need anticoagulant interruption within 3 months of diagnosis of a deep vein thrombosis or pulmonary embolism, and to consider using a temporary inferior vena cava filter for patients who have had a clot who need treatment interruption during the initial 3 to 4 weeks after diagnosis.
Practice guidelines: Perioperative anticoagulation
Guidance for preoperative and postoperative bridging for patients taking warfarin is summarized in Table 2.
CARDIAC PROCEDURES
For patients undergoing implantation of a cardioverter-defibrillator (ICD) or pacemaker, a procedure-specific concern is avoiding device-pocket hematoma.
Patients on warfarin: Do not bridge
The BRUISE CONTROL-1 trial (Bridge or Continue Coumadin for Device Surgery Randomized Controlled Trial)19 randomized patients undergoing pacemaker or ICD implantation either to continued anticoagulation without bridging (ie, warfarin continued as long as the international normalized ratio was < 3) or to conventional bridging (ie, warfarin stopped and low-molecular-weight heparin given). A clinically significant device-pocket hematoma occurred in 3.5% of the continued-warfarin group vs 16.0% of the heparin-bridging group (P < .001). Thromboembolic complications were rare, and rates did not differ between the 2 groups.
Results of the BRUISE CONTROL-1 trial serve as a caution against overly aggressive bridging. The study design called for resuming heparin 24 hours after surgery, which is perhaps more aggressive than standard practice. In our practice, we wait at least 24 hours to resume heparin after minor surgery, and 48 to 72 hours after surgery with a higher bleeding risk.
These results are perhaps not surprising if one considers how carefully surgeons control bleeding during surgery for patients taking anticoagulants. For patients not on an anticoagulant, minor bleeding may be less of a concern during a procedure, but when high doses of heparin are introduced soon after surgery, small concerns during surgery can become big problems afterward.
Based on these results, it is reasonable to undertake device implantation without interruption of a vitamin K antagonist such as warfarin.
Patients on direct oral anticoagulants: The jury is still out
The similar BRUISE CONTROL-2 trial is currently under way, comparing interruption vs continuation of dabigatran for patients undergoing cardiac device surgery.
In Europe, surgeons are less concerned than those in the United States about operating while a patient is on anticoagulant therapy. But the safety of this practice is not backed by strong evidence.
Direct oral anticoagulants: Consider pharmacokinetics
Direct oral anticoagulants are potent and fast-acting, with a peak effect 1 to 3 hours after intake. This rapid anticoagulant action is similar to that of bridging with low-molecular-weight heparin, and caution is needed when administering direct oral anticoagulants, especially after major surgery or surgery with a high bleeding risk.
Frost et al20 compared the pharmacokinetics of apixaban (twice-daily dosing) and rivaroxaban (once-daily dosing) and found that peak anticoagulant activity occurs faster and is higher with rivaroxaban. This is important because many patients take their anticoagulant first thing in the morning. Consequently, patients who require any kind of procedure (including dental) should skip the morning dose of the direct oral anticoagulant to avoid undergoing the procedure during the peak anticoagulant effect, and then either omit that day’s dose entirely or defer it until the evening after the procedure.
MANAGING SURGERY FOR PATIENTS ON A DIRECT ORAL ANTICOAGULANT
Case 3: An elderly woman on apixaban facing surgery
Let us imagine that our previous patient takes apixaban instead of warfarin. She is 75 years old, has atrial fibrillation, and is about to undergo elective colon resection for cancer. One doctor advises her to simply stop apixaban for 2 days, while another says she should go off apixaban for 5 days and will need bridging. Which plan is best?
In the perioperative setting, our goal is to interrupt patients’ anticoagulant therapy for the shortest time that results in no residual anticoagulant effect at the time of the procedure.
Guidelines further recommend that if the risk of venous thromboembolism is high, low-molecular-weight heparin bridging be given while the direct oral anticoagulant is stopped, with the heparin discontinued 24 hours before the procedure. This recommendation seems counterintuitive, as it advises replacing one short-acting anticoagulant with another (low-molecular-weight heparin).
The guidelines committee was unable to provide strength and grading of their recommendations, as too few well-designed studies are available to support them. The doctor in case 3 who advised stopping apixaban for 5 days and bridging is following the guidelines, but without much evidence to support this strategy.
Is bridging needed during interruption of a direct oral anticoagulant?
There are no randomized controlled trials of bridging vs no bridging in patients taking direct oral anticoagulants. Substudies of the atrial fibrillation trials included patients whose treatment was interrupted for procedures, but bridging was neither randomized nor standardized. Three of the four atrial fibrillation trials had a blinded design (warfarin vs direct oral anticoagulant), which complicated perioperative management, as physicians did not know the pharmacokinetics of the drug their patients were taking.22–24
We used the database from the Randomized Evaluation of Long-Term Anticoagulation Therapy (RE-LY) trial22 to evaluate bridging in patients taking either warfarin or dabigatran. Because the drug assignment was open-label (blinding applied only to the 110-mg and 150-mg dabigatran doses), clinicians knew whether patients were receiving warfarin or dabigatran, which facilitated perioperative management. Among dabigatran-treated patients, those who were bridged had significantly more major bleeding than those not bridged (6.5% vs 1.8%, P < .001), with no difference between the groups in stroke or systemic embolism. Although this was not a randomized comparison of bridging strategies, it provides evidence that bridging may not be advisable for patients taking a direct oral anticoagulant.
The 2017 American College of Cardiology guidelines25 conclude that parenteral bridging is not indicated for direct oral anticoagulants. Although this is not based on strong evidence, the guidance appears reasonable according to the evidence at hand.
The 2017 American Heart Association guidelines16 recommend a somewhat complex approach based on periprocedural bleeding risk and thromboembolic risk.
How long to interrupt direct oral anticoagulants?
Evidence on how long to interrupt comes from a prospective cohort study27 of 541 patients treated with dabigatran who were having elective surgery or an invasive procedure. Patients received standard perioperative management, with the timing of the last dabigatran dose before the procedure (24 hours, 48 hours, or 96 hours) based on the bleeding risk of the surgery and the patient’s creatinine clearance. Dabigatran was resumed 24 to 72 hours after the procedure. No heparin bridging was done. Patients were followed for up to 30 days postoperatively. The results were favorable, with few complications: 1 transient ischemic attack (0.2%), 10 major bleeding episodes (1.8%), and 28 minor bleeding episodes (5.2%).
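The study’s logic of scaling the interruption interval to renal function and procedural bleeding risk can be sketched as follows; the cutoffs here (50 mL/min; 24-, 48-, and 96-hour holds) are assumptions for illustration, not the published protocol verbatim:

```python
def hours_to_hold_dabigatran(crcl_ml_min: float, high_bleeding_risk: bool) -> int:
    """Illustrative interruption schedule (assumed cutoffs): hold longer
    when renal clearance is reduced or procedural bleeding risk is high."""
    if crcl_ml_min > 50:                       # near-normal clearance
        return 48 if high_bleeding_risk else 24
    return 96 if high_bleeding_risk else 48    # reduced clearance

print(hours_to_hold_dabigatran(80, high_bleeding_risk=False))  # -> 24
print(hours_to_hold_dabigatran(40, high_bleeding_risk=True))   # -> 96
```

The design choice is simply that dabigatran is renally cleared, so a lower creatinine clearance means more half-lives are needed to reach a negligible residual effect.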
A subgroup of 181 patients in this study28 had a plasma sample drawn just before surgery, allowing the investigators to assess the level of coagulation factors after dabigatran interruption. Results were as follows:
- 93% had a normal prothrombin time
- 80% had a normal activated partial thromboplastin time
- 33% had a normal thrombin time
- 81% had a normal dilute thrombin time.
The dilute thrombin time is considered the most reliable test of the anticoagulant effect of dabigatran but is not widely available. The activated partial thromboplastin time is more widely available and can be used, albeit less precisely, to assess whether a residual dabigatran effect is present; more sensitive activated partial thromboplastin time assays detect residual drug better.
Dabigatran levels were also measured. Although 66% of patients had low drug levels just before surgery, the others still had substantial dabigatran on board. The fact that bleeding event rates were so low in this study despite the presence of dabigatran in many patients raises the question of whether having some drug on board is a good predictor of bleeding risk.
A protocol with a longer interruption interval, 12 to 14 hours longer than in the previous study (3 days for procedures with a high bleeding risk, 2 days for those with a low bleeding risk), brought the activated partial thromboplastin time and dilute thrombin time to normal levels in 100% of patients undergoing high-bleeding-risk surgery. This study was based on small numbers, and its interruption strategy needs further investigation.29
Case 3 continued
The PAUSE study (NCT02228798), a multicenter, prospective cohort study, is designed to establish a safe, standardized protocol for the perioperative management of patients with atrial fibrillation taking dabigatran, rivaroxaban, or apixaban and will include 3,300 patients.
PATIENTS WITH A CORONARY STENT WHO NEED SURGERY
Case 4: A woman with a stent facing surgery
A 70-year-old woman needs breast cancer resection. She has coronary artery disease and had a drug-eluting stent placed 5 months ago after elective cardiac catheterization. She also has hypertension, obesity, and type 2 diabetes. Her medications include an angiotensin II receptor blocker, hydrochlorothiazide, insulin, and an oral hypoglycemic. She is also taking aspirin 81 mg daily and ticagrelor (a P2Y12 receptor antagonist) 90 mg twice daily.
Her cardiologist is concerned that stopping antiplatelet therapy could trigger acute stent thrombosis, which has a 50% or higher mortality rate.
Should she stop taking aspirin before surgery? What about the ticagrelor?
Is aspirin safe during surgery?
Evidence concerning aspirin during surgery comes from Perioperative Ischemic Evaluation 2 (POISE-2), a double-blind, randomized controlled trial.30 Patients who had known cardiovascular disease or risk factors for cardiovascular disease and were about to undergo noncardiac surgery were stratified according to whether they had been taking aspirin before the study (patients taking aspirin within 72 hours of the surgery were excluded from randomization). Participants in each group were randomized to take either aspirin or placebo just before surgery. The primary outcome was the combined rate of death or nonfatal myocardial infarction 30 days after randomization.
The study found no difference in the primary end point between the two groups. However, major bleeding occurred significantly more often in the aspirin group (4.6% vs 3.8%, hazard ratio 1.2, 95% confidence interval 1.0–1.5).
Moreover, only 4% of the patients in this trial had a cardiac stent. The trial excluded patients who had had a bare-metal stent placed within 6 weeks or a drug-eluting stent placed within 1 year, so it does not help us answer whether aspirin should be stopped for our current patient.
Is surgery safe for patients with stents?
The safety of undergoing surgery with a stent was investigated in a large US Veterans Administration retrospective cohort study.31 More than 20,000 patients with stents who underwent noncardiac surgery within 2 years of stent placement were compared with a control group of more than 41,000 patients with stents who did not undergo surgery. Patients were matched by stent type and cardiac risk factors at the time of stent placement.
The risk of an adverse cardiac event in both the surgical and nonsurgical cohorts was highest in the initial 6 weeks after stent placement and plateaued 6 months after stent placement, when the risk difference between the surgical and nonsurgical groups leveled off to 1%.
The risk of a major adverse cardiac event postoperatively was much more dependent on the timing of stent placement in complex and inpatient surgeries. For outpatient surgeries, the risk of a major cardiac event was very low and the timing of stent placement did not matter.
A Danish observational study32 compared more than 4,000 patients with drug-eluting stents having surgery to more than 20,000 matched controls without coronary heart disease having similar surgery. The risk of myocardial infarction or cardiac death was much higher for patients undergoing surgery within 1 month after drug-eluting stent placement compared with controls without heart disease and patients with stent placement longer than 1 month before surgery.
Our practice is to continue aspirin for surgery in patients with coronary stents regardless of the timing of placement. Although continuing aspirin slightly increases the risk of bleeding, this must be balanced against the thrombotic risk. We typically stop clopidogrel 5 to 7 days before surgery and ticagrelor 3 to 5 days before surgery. If both antiplatelet drugs must be continued, for example in a patient who received a drug-eluting stent within the past 3 months, we may give platelets before very-high-risk surgery (eg, intracranial, spinal). It is essential to involve the cardiologist and surgeon in these decisions.
BOTTOM LINE
1. Kearon C, Akl EA, Ornelas J, et al. Antithrombotic therapy for VTE disease: CHEST guideline and expert panel report. Chest 2016; 149:315–352.
2. Enden T, Haig Y, Klow NE, et al; CaVenT Study Group. Long-term outcome after additional catheter-directed thrombolysis versus standard treatment for acute iliofemoral deep vein thrombosis (the CaVenT study): a randomised controlled trial. Lancet 2012; 379:31–38.
3. Haig Y, Enden T, Grotta O, et al; CaVenT Study Group. Post-thrombotic syndrome after catheter-directed thrombolysis for deep vein thrombosis (CaVenT): 5-year follow-up results of an open-label, randomized controlled trial. Lancet Haematol 2016; 3:e64–e71.
4. Vedantham S, Goldhaber SZ, Kahn SR, et al. Rationale and design of the ATTRACT Study: a multicenter randomized trial to evaluate pharmacomechanical catheter-directed thrombolysis for the prevention of postthrombotic syndrome in patients with proximal deep vein thrombosis. Am Heart J 2013; 165:523–530.
5. Van Es N, Coppens M, Schulman S, Middeldorp S, Buller HR. Direct oral anticoagulants compared with vitamin K antagonists for acute venous thromboembolism: evidence from phase 3 trials. Blood 2014; 124:1968–1975.
6. Douketis JD, Spyropoulos AC, Kaatz S, et al; BRIDGE Investigators. Perioperative bridging anticoagulation in patients with atrial fibrillation. N Engl J Med 2015; 373:823–833.
7. Douketis J, Johnson JA, Turpie AG. Low-molecular-weight heparin as bridging anticoagulation during interruption of warfarin: assessment of a standardized periprocedural anticoagulation regimen. Arch Intern Med 2004; 164:1319–1326.
8. Dunn AS, Spyropoulos AC, Turpie AG. Bridging therapy in patients on long-term oral anticoagulants who require surgery: the Prospective Peri-operative Enoxaparin Cohort Trial (PROSPECT). J Thromb Haemost 2007; 5:2211–2218.
9. Kovacs MJ, Kearon C, Rodger M, et al. Single-arm study of bridging therapy with low-molecular-weight heparin for patients at risk of arterial embolism who require temporary interruption of warfarin. Circulation 2004; 110:1658–1663.
10. Spyropoulos AC, Turpie AG, Dunn AS, et al; REGIMEN Investigators. Clinical outcomes with unfractionated heparin or low-molecular-weight heparin as bridging therapy in patients on long-term oral anticoagulants: the REGIMEN registry. J Thromb Haemost 2006; 4:1246–1252.
11. Douketis JD, Woods K, Foster GA, Crowther MA. Bridging anticoagulation with low-molecular-weight heparin after interruption of warfarin therapy is associated with a residual anticoagulant effect prior to surgery. Thromb Haemost 2005; 94:528–531.
12. Schulman S, Hwang HG, Eikelboom JW, Kearon C, Pai M, Delaney J. Loading dose vs. maintenance dose of warfarin for reinitiation after invasive procedures: a randomized trial. J Thromb Haemost 2014; 12:1254–1259.
13. Siegal D, Yudin J, Kaatz S, Douketis JD, Lim W, Spyropoulos AC. Periprocedural heparin bridging in patients receiving vitamin K antagonists: systematic review and meta-analysis of bleeding and thromboembolic rates. Circulation 2012; 126:1630–1639.
14. Skeith L, Taylor J, Lazo-Langner A, Kovacs MJ. Conservative perioperative anticoagulation management in patients with chronic venous thromboembolic disease: a cohort study. J Thromb Haemost 2012; 10:2298–2304.
15. Douketis JD, Spyropoulos AC, Spencer FA, et al. Perioperative management of antithrombotic therapy: Antithrombotic Therapy and Prevention of Thrombosis, 9th ed: American College of Chest Physicians evidence-based clinical practice guidelines. Chest 2012; 141(2 suppl):e326S–e350S.
16. Doherty JU, Gluckman TJ, Hucker WJ, et al. 2017 ACC expert consensus decision pathway for periprocedural management of anticoagulation in patients with nonvalvular atrial fibrillation: a report of the American College of Cardiology Clinical Expert Consensus Document Task Force. J Am Coll Cardiol 2017; 69:871–898.
17. Raval AN, Cigarroa JE, Chung MK, et al; American Heart Association Clinical Pharmacology Subcommittee of the Acute Cardiac Care and General Cardiology Committee of the Council on Clinical Cardiology; Council on Cardiovascular Disease in the Young; and Council on Quality of Care and Outcomes Research. Management of patients on non-vitamin K antagonist oral anticoagulants in the acute care and periprocedural setting: a scientific statement from the American Heart Association. Circulation 2017; 135:e604–e633.
18. Tafur A, Douketis J. Perioperative anticoagulant management in patients with atrial fibrillation: practical implications of recent clinical trials. Pol Arch Med Wewn 2015; 125:666–671.
19. Birnie DH, Healey JS, Wells GA, et al; BRUISE CONTROL Investigators. Pacemaker or defibrillator surgery without interruption of anticoagulation. N Engl J Med 2013; 368:2084–2093.
20. Frost C, Song Y, Barrett YC, et al. A randomized direct comparison of the pharmacokinetics and pharmacodynamics of apixaban and rivaroxaban. Clin Pharmacol 2014; 6:179–187.
21. Narouze S, Benzon HT, Provenzano DA, et al. Interventional spine and pain procedures in patients on antiplatelet and anticoagulant medications: guidelines from the American Society of Regional Anesthesia and Pain Medicine, the European Society of Regional Anesthesia and Pain Therapy, the American Academy of Pain Medicine, the International Neuromodulation Society, the North American Neuromodulation Society, and the World Institute of Pain. Reg Anesth Pain Med 2015; 40:182–212.
22. Douketis JD, Healey JS, Brueckmann M, et al. Perioperative bridging anticoagulation during dabigatran or warfarin interruption among patients who had an elective surgery or procedure. Substudy of the RE-LY trial. Thromb Haemost 2015; 113:625–632.
23. Steinberg BA, Peterson ED, Kim S, et al; Outcomes Registry for Better Informed Treatment of Atrial Fibrillation Investigators and Patients. Use and outcomes associated with bridging during anticoagulation interruptions in patients with atrial fibrillation: findings from the Outcomes Registry for Better Informed Treatment of Atrial Fibrillation (ORBIT-AF). Circulation 2015; 131:488–494.
24. Garcia D, Alexander JH, Wallentin L, et al. Management and clinical outcomes in patients treated with apixaban vs warfarin undergoing procedures. Blood 2014; 124:3692–3698.
25. Doherty JU, Gluckman TJ, Hucker WJ, et al. 2017 ACC expert consensus decision pathway for periprocedural management of anticoagulation in patients with nonvalvular atrial fibrillation: a report of the American College of Cardiology Clinical Expert Consensus Document Task Force. J Am Coll Cardiol 2017; 69:871–898.
26. Thrombosis Canada. NOACs/DOACs: Peri-operative management. http://thrombosiscanada.ca/?page_id=18#. Accessed August 30, 2017.
27. Schulman S, Carrier M, Lee AY, et al; Periop Dabigatran Study Group. Perioperative management of dabigatran: a prospective cohort study. Circulation 2015; 132:167–173.
28. Douketis JD, Wang G, Chan N, et al. Effect of standardized perioperative dabigatran interruption on the residual anticoagulation effect at the time of surgery or procedure. J Thromb Haemost 2016; 14:89–97.
29. Douketis JD, Syed S, Schulman S. Periprocedural management of direct oral anticoagulants: comment on the 2015 American Society of Regional Anesthesia and Pain Medicine guidelines. Reg Anesth Pain Med 2016; 41:127–129.
30. Devereaux PJ, Mrkobrada M, Sessler DI, et al; POISE-2 Investigators. Aspirin in patients undergoing noncardiac surgery. N Engl J Med 2014; 370:1494–1503.
31. Holcomb CN, Graham LA, Richman JS, et al. The incremental risk of noncardiac surgery on adverse cardiac events following coronary stenting. J Am Coll Cardiol 2014; 64:2730–2739.
32. Egholm G, Kristensen SD, Thim T, et al. Risk associated with surgery within 12 months after coronary drug-eluting stent implantation. J Am Coll Cardiol 2016; 68:2622–2632.
This article reviews recommendations and evidence concerning current anticoagulant management for venous thromboembolism and perioperative care, with an emphasis on individualizing treatment for real-world patients.
TREATING ACUTE VENOUS THROMBOEMBOLISM
Case 1: Deep vein thrombosis in an otherwise healthy man
A 40-year-old man presents with 7 days of progressive right leg swelling. He has no antecedent risk factors for deep vein thrombosis or other medical problems. Venous ultrasonography reveals an iliofemoral deep vein thrombosis. How should he be managed?
- Outpatient treatment with low-molecular-weight heparin for 4 to 6 days plus warfarin
- Outpatient treatment with a direct oral anticoagulant, ie, apixaban, dabigatran (which requires 4 to 6 days of initial treatment with low-molecular-weight heparin), or rivaroxaban
- Catheter-directed thrombolysis followed by low-molecular-weight heparin, then warfarin or a direct oral anticoagulant
- Inpatient intravenous heparin for 7 to 10 days, then warfarin or a direct oral anticoagulant
All of these are acceptable for managing acute venous thromboembolism, but the clinician’s role is to identify which treatment is most appropriate for an individual patient.
Deep vein thrombosis is not a single condition
Multiple guidelines exist to help decide on a management strategy. Those of the American College of Chest Physicians (ACCP)1 are used most often.
That said, guidelines are written for “average” patients, so it is important to look beyond them and individualize management. Venous thromboembolism is not a single entity; it has myriad clinical presentations that may call for different treatments. Most patients have submassive deep vein thrombosis or pulmonary embolism, which is neither limb-threatening nor associated with hemodynamic instability. Venous thromboembolism also differs in etiology: it can be unprovoked (idiopathic), cancer-related, catheter-associated, or provoked by surgery or immobility.
Deep vein thrombosis has a wide spectrum of presentations. It can involve the veins of the calf only, or it can involve the femoral and iliac veins and other locations including the splanchnic veins, the cerebral sinuses, and upper extremities. Pulmonary embolism can be massive (defined as being associated with hemodynamic instability or impending respiratory failure) or submassive. Similarly, patients differ in terms of baseline medical conditions, mobility, and lifestyle. Anticoagulant management decisions should take all these factors into account.
Consider clot location
Our patient with iliofemoral deep vein thrombosis is best managed differently from a more typical patient with less extensive thrombosis involving the popliteal or femoral vein segments, or both. A clot involving the iliac vein is more likely to lead to postthrombotic chronic pain and swelling, because few collateral channels exist to bypass the clot and venous pressure in the affected leg therefore rises. For our patient, then, catheter-directed thrombolysis is an option that should be considered.
Catheter-directed thrombolysis trials
According to the “open-vein hypothesis,” quickly eliminating the thrombus and restoring unobstructed venous flow may mitigate the risk not only of recurrent thrombosis, but also of postthrombotic syndrome, which is often not given much consideration acutely but can cause significant, life-altering chronic disability.
The “valve-integrity hypothesis” is also important; it holds that lytic therapy may help prevent damage to the venous valves and thereby mitigate venous hypertension.
Thus, catheter-directed thrombolysis offers theoretical benefits, and recent trials have assessed it against standard anticoagulation treatments.
The CaVenT trial (Catheter-Directed Venous Thrombolysis),2 conducted in Norway, randomized 209 patients with midfemoral to iliac deep vein thrombosis to conventional treatment (anticoagulation alone) or anticoagulation plus catheter-directed thrombolysis. At 2 years, postthrombotic syndrome had occurred in 41% of the catheter-directed thrombolysis group compared with 56% of the conventional treatment group (P = .047). At 5 years, the difference widened to 43% vs 71% (P < .01, number needed to treat = 4).3 Despite the superiority of lytic therapy, the incidence of postthrombotic syndrome remained high in patients who received this treatment.
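The reported number needed to treat can be checked directly from the quoted 5-year event rates. A small arithmetic sketch (the function name is ours, for illustration):

```python
import math

def number_needed_to_treat(control_rate: float, treated_rate: float) -> int:
    """NNT = 1 / absolute risk reduction, conventionally rounded up."""
    return math.ceil(1 / (control_rate - treated_rate))

# CaVenT at 5 years: postthrombotic syndrome in 71% with anticoagulation
# alone vs 43% with added catheter-directed thrombolysis.
print(number_needed_to_treat(0.71, 0.43))  # prints 4, matching the reported NNT
```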
The ATTRACT trial (Acute Venous Thrombosis: Thrombus Removal With Adjunctive Catheter-Directed Thrombolysis),4 a US multicenter, open-label, assessor-blind study, randomized 698 patients with femoral or more-proximal deep vein thrombosis to either standard care (anticoagulant therapy and graduated elastic compression stockings) or standard care plus catheter-directed thrombolysis. In preliminary results presented at the Society of Interventional Radiology meeting in March 2017, although no difference was found in the primary outcome (postthrombotic syndrome at 24 months), catheter-directed thrombolysis for iliofemoral deep vein thrombosis led to a 25% reduction in moderate to severe postthrombotic syndrome.
Although it is too early to draw conclusions before publication of the ATTRACT study, the preliminary results highlight the need to individualize treatment and to be selective about using catheter-directed thrombolysis. The trials provide reassurance that catheter-directed lysis is a reasonable and safe intervention when performed by physicians experienced in the procedure. The risk of major bleeding appears to be low (about 2%) and that for intracranial hemorrhage even lower (< 0.5%).
Catheter-directed thrombolysis is appropriate in some cases
The 2016 ACCP guidelines1 recommend anticoagulant therapy alone over catheter-directed thrombolysis for patients with acute proximal deep vein thrombosis of the leg. However, it is a grade 2C (weak) recommendation.
They provide no specific recommendation as to the clinical indications for catheter-directed thrombolysis, but identify patients who would be most likely to benefit, ie, those who have:
- Iliofemoral deep vein thrombosis
- Symptoms for less than 14 days
- Good functional status
- Life expectancy of more than 1 year
- Low risk of bleeding.
Our patient satisfies these criteria, suggesting that catheter-directed thrombolysis is a reasonable option for him.
Timing is important. Catheter-directed lysis is more likely to be beneficial if used before fibrin deposits form and stiffen the venous valves, causing irreversible damage that leads to postthrombotic syndrome.
Role of direct oral anticoagulants
The availability of direct oral anticoagulants has generated interest in defining their therapeutic role in patients with venous thromboembolism.
In a meta-analysis5 of major trials comparing direct oral anticoagulants and vitamin K antagonists such as warfarin, no significant difference was found for the risk of recurrent venous thromboembolism or venous thromboembolism-related deaths. However, fewer patients experienced major bleeding with direct oral anticoagulants (relative risk 0.61, P = .002). Although significant, the absolute risk reduction was small; the incidence of major bleeding was 1.1% with direct oral anticoagulants vs 1.8% with vitamin K antagonists.
The main advantage of direct oral anticoagulants is greater convenience for the patient.
WHICH PATIENTS ON WARFARIN NEED BRIDGING PREOPERATIVELY?
Many patients still take warfarin, particularly those with atrial fibrillation, a mechanical heart valve, or venous thromboembolism. In many countries, warfarin remains the dominant anticoagulant for stroke prevention. Whether these patients need heparin bridging during perioperative warfarin interruption is a frequently encountered question that, until recently, was controversial. Recent studies have helped clarify the need for heparin bridging in many of these patients.
Case 2: An elderly woman on warfarin facing cancer surgery
A 75-year-old woman weighing 65 kg is scheduled for elective colon resection for incidentally found colon cancer. She is taking warfarin for atrial fibrillation. She also has hypertension and diabetes and had a transient ischemic attack 10 years ago.
One doctor told her she needs to be assessed for heparin bridging, but another told her she does not need bridging.
The default management should be not to bridge patients who have atrial fibrillation, but to consider bridging in selected patients, such as those with recent stroke or transient ischemic attack or a prior thromboembolic event during warfarin interruption. However, decisions about bridging should not be made on the basis of the CHADS2 score alone. For the patient described here, I would recommend not bridging.
Complex factors contribute to stroke risk
Stroke risk for patients with atrial fibrillation can be quickly estimated with the CHADS2 score, based on:
- Congestive heart failure (1 point)
- Hypertension (1 point)
- Age at least 75 (1 point)
- Diabetes (1 point)
- Stroke or transient ischemic attack (2 points).
Our patient has a score of 5, corresponding to an annual adjusted stroke risk of 12.5%. Whether her transient ischemic attack of 10 years ago is comparable in significance to a recent stroke is debatable and highlights a weakness of clinical prediction rules. Moreover, such prediction scores were developed to estimate the long-term risk of stroke if anticoagulants are not given, and they have not been assessed in a perioperative setting where there is short-term interruption of anticoagulants. Also, the perioperative milieu is associated with additional factors not captured in these clinical prediction rules that may affect the risk of stroke.
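The scoring above is simple to automate. A minimal sketch; the score-to-risk table is the adjusted annual stroke rate (%/year) from the original CHADS2 validation cohort, included here for illustration only, and, as noted, it does not translate directly into perioperative risk:

```python
# Adjusted annual stroke risk (%) by CHADS2 score, from the original
# validation cohort (Gage et al); illustrative, not perioperative risk.
CHADS2_ANNUAL_STROKE_RISK = {0: 1.9, 1: 2.8, 2: 4.0, 3: 5.9,
                             4: 8.5, 5: 12.5, 6: 18.2}

def chads2(chf: bool, hypertension: bool, age_75_or_older: bool,
           diabetes: bool, prior_stroke_or_tia: bool) -> int:
    """One point per risk factor; two points for prior stroke/TIA."""
    return (chf + hypertension + age_75_or_older + diabetes
            + 2 * prior_stroke_or_tia)

# The case 2 patient: age 75, hypertension, diabetes, remote TIA.
score = chads2(chf=False, hypertension=True, age_75_or_older=True,
               diabetes=True, prior_stroke_or_tia=True)
print(score, CHADS2_ANNUAL_STROKE_RISK[score])  # prints: 5 12.5
```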
Thus, the risk of perioperative stroke likely involves the interplay of multiple factors, including the type of surgery the patient is undergoing. Some factors may be mitigated:
- Rebound hypercoagulability after stopping an oral anticoagulant can be prevented by intraoperative blood pressure and volume control
- Elevated biochemical factors (eg, D-dimer, B-type natriuretic peptide, troponin) may be lowered with perioperative aspirin therapy
- Lipid and genetic factors may be mitigated with perioperative statin use.
Can heparin bridging also mitigate the risk?
Bridging in patients with atrial fibrillation
Most patients who are taking warfarin are doing so because of atrial fibrillation, so most evidence about perioperative bridging was developed in such patients.
The BRIDGE trial (Bridging Anticoagulation in Patients Who Require Temporary Interruption of Warfarin Therapy for an Elective Invasive Procedure or Surgery)6 was the first randomized controlled trial to compare a bridging and a no-bridging strategy in patients with atrial fibrillation who required warfarin interruption for elective surgery. Nearly 2,000 patients were given either low-molecular-weight heparin or placebo from 3 days before the procedure until 24 hours before it, and then for 5 to 10 days afterward. For all patients, warfarin was stopped 5 days before the procedure and resumed within 24 hours afterward.
A no-bridging strategy was noninferior to bridging: the risk of perioperative arterial thromboembolism was 0.4% without bridging vs 0.3% with bridging (P = .01 for noninferiority). In addition, a no-bridging strategy conferred a lower risk of major bleeding than bridging: 1.3% vs 3.2% (relative risk 0.41, P = .005 for superiority).
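The bleeding comparison can be re-derived from the quoted rates; this is illustrative arithmetic only (BRIDGE reported the relative risk directly):

```python
# BRIDGE major bleeding: 1.3% without bridging vs 3.2% with bridging.
no_bridging_rate = 0.013
bridging_rate = 0.032

relative_risk = no_bridging_rate / bridging_rate    # ~0.41, as reported
risk_difference = bridging_rate - no_bridging_rate  # 1.9 percentage points

print(f"relative risk {relative_risk:.2f}, absolute difference {risk_difference:.1%}")
```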
Although the difference in absolute bleeding risk was small, bleeding rates were lower than those seen outside of clinical trials because the bridging protocol used in BRIDGE was designed to minimize the risk of bleeding. Also, although only 5% of patients had a CHADS2 score of 5 or 6, such patients are also infrequent in clinical practice, and BRIDGE did include a considerable proportion (17%) of patients with a prior stroke or transient ischemic attack, who would be considered at high risk.
Other evidence about heparin bridging is derived from observational studies, more than 10 of which have been conducted. In general, they have found that not bridging is associated with low rates of arterial thromboembolism (< 0.5%) and that bridging is associated with high rates of major bleeding (4%–7%).7–12
Bridging in patients with a mechanical heart valve
Warfarin is the only anticoagulant option for patients who have a mechanical heart valve. No randomized controlled trials have evaluated the benefits of perioperative bridging vs no bridging in this setting.
Observational (cohort) studies suggest that the risk of perioperative arterial thromboembolism is similar with or without bridging anticoagulation, although most patients studied were bridged and those not bridged were considered at low risk (eg, with a bileaflet aortic valve and no additional risk factors).13 However, without stronger evidence from randomized controlled trials, bridging should be the default management for patients with a mechanical heart valve. In our practice, we bridge most patients who have a mechanical heart valve unless they are considered to be at low risk, such as those who have a bileaflet aortic valve.
Bridging in patients with prior venous thromboembolism
Even less evidence is available for periprocedural management of patients who have a history of venous thromboembolism. No randomized controlled trials exist evaluating bridging vs no bridging. In 1 cohort study in which more than 90% of patients had had thromboembolism more than 3 months before the procedure, the rate of recurrent venous thromboembolism without bridging was less than 0.5%.14
It is reasonable to bridge patients who need anticoagulant interruption within 3 months of a diagnosis of deep vein thrombosis or pulmonary embolism, and to consider a temporary inferior vena cava filter for patients who need treatment interruption during the initial 3 to 4 weeks after diagnosis.
Practice guidelines: Perioperative anticoagulation
Guidance for preoperative and postoperative bridging for patients taking warfarin is summarized in Table 2.
CARDIAC PROCEDURES
For patients undergoing implantation of a cardioverter-defibrillator (ICD) or pacemaker, a procedure-specific concern is avoiding pocket hematoma.
Patients on warfarin: Do not bridge
The BRUISE CONTROL-1 trial (Bridge or Continue Coumadin for Device Surgery Randomized Controlled Trial)19 randomized patients undergoing pacemaker or ICD implantation either to continued anticoagulation without bridging (ie, warfarin was continued as long as the international normalized ratio was < 3) or to conventional bridging (ie, warfarin was stopped and low-molecular-weight heparin given). A clinically significant device-pocket hematoma occurred in 3.5% of the continued-warfarin group vs 16.0% of the heparin-bridging group (P < .001). Thromboembolic complications were rare, and rates did not differ between the two groups.
Results of BRUISE CONTROL-1 serve as a caution against overly aggressive bridging. The study design called for resuming heparin 24 hours after surgery, which is perhaps more aggressive than standard practice. In our practice, we wait at least 24 hours to resume heparin after minor surgery, and 48 to 72 hours after surgery with a higher bleeding risk.
These results are perhaps not surprising if one considers how carefully surgeons try to control bleeding during surgery for patients taking anticoagulants. For patients who are not on an anticoagulant, minor bleeding may be less of a concern during a procedure, but when high doses of heparin are introduced soon after surgery, small concerns during surgery can become big problems afterward.
Based on these results, it is reasonable to undertake device implantation without interruption of a vitamin K antagonist such as warfarin.
Patients on direct oral anticoagulants: The jury is still out
The similar BRUISE CONTROL-2 trial is currently under way, comparing interruption vs continuation of dabigatran for patients undergoing cardiac device surgery.
In Europe, surgeons are less concerned than those in the United States about operating while a patient is on anticoagulant therapy. But the safety of this practice is not backed by strong evidence.
Direct oral anticoagulants: Consider pharmacokinetics
Direct oral anticoagulants are potent and fast-acting, with a peak effect 1 to 3 hours after intake. This rapid anticoagulant action is similar to that of bridging with low-molecular-weight heparin, and caution is needed when administering direct oral anticoagulants, especially after major surgery or surgery with a high bleeding risk.
Frost et al20 compared the pharmacokinetics of apixaban (twice-daily dosing) and rivaroxaban (once-daily dosing) and found that peak anticoagulant activity occurs faster and is higher with rivaroxaban. This matters because many patients take their anticoagulant first thing in the morning. Consequently, patients who require any kind of procedure (including dental) should skip the morning dose of the direct oral anticoagulant so that the procedure is not performed during the peak anticoagulant effect; they can then either omit that day’s dose or defer it until the evening after the procedure.
MANAGING SURGERY FOR PATIENTS ON A DIRECT ORAL ANTICOAGULANT
Case 3: An elderly woman on apixaban facing surgery
Let us imagine that our previous patient takes apixaban instead of warfarin. She is 75 years old, has atrial fibrillation, and is about to undergo elective colon resection for cancer. One doctor advises her to simply stop apixaban for 2 days, while another says she should go off apixaban for 5 days and will need bridging. Which plan is best?
In the perioperative setting, our goal is to interrupt patients’ anticoagulant therapy for the shortest time that results in no residual anticoagulant effect at the time of the procedure.
The guidelines further recommend that if the risk of venous thromboembolism is high, low-molecular-weight heparin bridging should be given while the direct oral anticoagulant is stopped, with the heparin discontinued 24 hours before the procedure. This recommendation seems counterintuitive, as it advises replacing one short-acting anticoagulant with another short-acting anticoagulant, low-molecular-weight heparin.
The guidelines committee was unable to provide strength and grading of their recommendations, as too few well-designed studies are available to support them. The doctor in case 3 who advised stopping apixaban for 5 days and bridging is following the guidelines, but without much evidence to support this strategy.
Is bridging needed during interruption of a direct oral anticoagulant?
There are no randomized, controlled trials of bridging vs no bridging in patients taking direct oral anticoagulants. Substudies exist of patients taking these drugs for atrial fibrillation who had treatment interrupted for procedures, but the studies did not randomize bridging vs no bridging, nor were bridging regimens standardized. Three of the four atrial fibrillation trials had a blinded design (warfarin vs direct oral anticoagulants), making perioperative management difficult, as physicians did not know the pharmacokinetics of the drugs their patients were taking.22–24
We used the database from the Randomized Evaluation of Long-Term Anticoagulation Therapy (RE-LY) trial22 to evaluate bridging in patients taking either warfarin or dabigatran. With an open-label study design (the blinding was only for the 110 mg and 150 mg dabigatran doses), clinicians were aware of whether patients were receiving warfarin or dabigatran, thereby facilitating perioperative management. Among dabigatran-treated patients, those who were bridged had significantly more major bleeding than those not bridged (6.5% vs 1.8%, P < .001), with no difference between the groups for stroke or systemic embolism. Although it is not a randomized controlled trial, it does provide evidence that bridging may not be advisable for patients taking a direct oral anticoagulant.
The 2017 American College of Cardiology guidelines25 conclude that parenteral bridging is not indicated for direct oral anticoagulants. Although this is not based on strong evidence, the guidance appears reasonable according to the evidence at hand.
The 2017 American Heart Association Guidelines16 recommend a somewhat complex approach based on periprocedural bleeding risk and thromboembolic risk.
How long to interrupt direct oral anticoagulants?
Evidence comes from a prospective cohort study27 of 541 patients treated with dabigatran who were undergoing an elective surgery or invasive procedure. Patients received standard perioperative management, with the timing of the last dabigatran dose before the procedure (24, 48, or 96 hours) based on the bleeding risk of the surgery and the patient’s creatinine clearance. Dabigatran was resumed 24 to 72 hours after the procedure. No heparin bridging was used. Patients were followed for up to 30 days postoperatively. The results were favorable, with few complications: 1 transient ischemic attack (0.2%), 10 major bleeding episodes (1.8%), and 28 minor bleeding episodes (5.2%).
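Dose-timing logic of this kind can be sketched as a small calculator. The Cockcroft-Gault formula below is the standard estimate of creatinine clearance; the clearance threshold and hold times, and the serum creatinine in the example, are our illustrative assumptions, not the study's exact protocol:

```python
def creatinine_clearance_ml_min(age: int, weight_kg: float,
                                serum_cr_mg_dl: float, female: bool) -> float:
    """Cockcroft-Gault estimate of creatinine clearance (mL/min)."""
    crcl = (140 - age) * weight_kg / (72 * serum_cr_mg_dl)
    return crcl * 0.85 if female else crcl

def dabigatran_hold_hours(crcl: float, high_bleeding_risk: bool) -> int:
    """Assumed mapping of renal function and procedural bleeding risk to
    the interval between the last dose and surgery (24, 48, or 96 h)."""
    if crcl >= 50:
        return 48 if high_bleeding_risk else 24
    return 96 if high_bleeding_risk else 48

# Example: a 75-year-old, 65-kg woman (as in case 2), assuming a
# hypothetical serum creatinine of 0.9 mg/dL, before high-bleeding-risk surgery.
crcl = creatinine_clearance_ml_min(75, 65, 0.9, female=True)
print(round(crcl), dabigatran_hold_hours(crcl, high_bleeding_risk=True))  # prints: 55 48
```

Any real protocol should come from the treating team and current guidance, not from a sketch like this.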
A subgroup of 181 patients in this study28 had a plasma sample drawn just before surgery, allowing the investigators to assess the level of coagulation factors after dabigatran interruption. Results were as follows:
- 93% had a normal prothrombin time
- 80% had a normal activated partial thromboplastin time
- 33% had a normal thrombin time
- 81% had a normal dilute thrombin time.
The dilute thrombin time is considered the most reliable test of the anticoagulant effect of dabigatran but is not widely available. The activated partial thromboplastin time can provide a more widely used coagulation test to assess (in a less precise manner) whether there is an anticoagulant effect of dabigatran present, and more sensitive activated partial thromboplastin time assays can be used to better detect any residual dabigatran effect.
Dabigatran levels were also measured. Although 66% of patients had low drug levels just before surgery, the others still had substantial dabigatran on board. The fact that bleeding event rates were so low in this study despite the presence of dabigatran in many patients raises the question of whether having some drug on board is a good predictor of bleeding risk.
An interruption protocol with a longer interruption interval—12 to 14 hours longer than in the previous study (3 days for high-bleed risk procedures, 2 days for low-bleed risk procedures)—brought the activated partial thromboplastin time and dilute thrombin time to normal levels for 100% of patients with the protocol for high-bleeding-risk surgery. This study was based on small numbers and its interruption strategy needs further investigation.29
Case 3 continued
The PAUSE study (NCT02228798), a multicenter, prospective cohort study, is designed to establish a safe, standardized protocol for the perioperative management of patients with atrial fibrillation taking dabigatran, rivaroxaban, or apixaban and will include 3,300 patients.
PATIENTS WITH A CORONARY STENT WHO NEED SURGERY
Case 4: A woman with a stent facing surgery
A 70-year-old woman needs breast cancer resection. She has coronary artery disease and had a drug-eluting stent placed 5 months ago after elective cardiac catheterization. She also has hypertension, obesity, and type 2 diabetes. Her medications include an angiotensin II receptor blocker, hydrochlorothiazide, insulin, and an oral hypoglycemic. She is also taking aspirin 81 mg daily and ticagrelor (a P2Y12 receptor antagonist) 90 mg twice daily.
Her cardiologist is concerned that stopping antiplatelet therapy could trigger acute stent thrombosis, which has a 50% or higher mortality rate.
Should she stop taking aspirin before surgery? What about the ticagrelor?
Is aspirin safe during surgery?
Evidence concerning aspirin during surgery comes from Perioperative Ischemic Evaluation 2 (POISE-2), a double-blind, randomized controlled trial.30 Patients who had known cardiovascular disease or risk factors for cardiovascular disease and were about to undergo noncardiac surgery were stratified according to whether they had been taking aspirin before the study (patients taking aspirin within 72 hours of the surgery were excluded from randomization). Participants in each group were randomized to take either aspirin or placebo just before surgery. The primary outcome was the combined rate of death or nonfatal myocardial infarction 30 days after randomization.
The study found no differences in the primary end point between the two groups. However, major bleeding occurred significantly more often in the aspirin group (4.6% vs 3.8%, hazard ratio 1.2, 95% confidence interval 1.0–1.5).
Moreover, only 4% of the patients in this trial had a cardiac stent. The trial excluded patients who had had a bare-metal stent placed within 6 weeks or a drug-eluting stent placed within 1 year, so it does not help us answer whether aspirin should be stopped for our current patient.
Is surgery safe for patients with stents?
The safety of undergoing surgery with a stent was investigated in a large US Veterans Administration retrospective cohort study.31 More than 20,000 patients with stents who underwent noncardiac surgery within 2 years of stent placement were compared with a control group of more than 41,000 patients with stents who did not undergo surgery. Patients were matched by stent type and cardiac risk factors at the time of stent placement.
The risk of an adverse cardiac event in both the surgical and nonsurgical cohorts was highest in the initial 6 weeks after stent placement and plateaued 6 months after stent placement, when the risk difference between the surgical and nonsurgical groups leveled off to 1%.
The risk of a major adverse cardiac event postoperatively was much more dependent on the timing of stent placement in complex and inpatient surgeries. For outpatient surgeries, the risk of a major cardiac event was very low and the timing of stent placement did not matter.
A Danish observational study32 compared more than 4,000 patients with drug-eluting stents having surgery to more than 20,000 matched controls without coronary heart disease having similar surgery. The risk of myocardial infarction or cardiac death was much higher for patients undergoing surgery within 1 month after drug-eluting stent placement compared with controls without heart disease and patients with stent placement longer than 1 month before surgery.
Our practice is to continue aspirin for surgery in patients with coronary stents regardless of the timing of placement. Although there is a small increased risk of bleeding, this must be balanced against thrombotic risk. We typically stop clopidogrel 5 to 7 days before surgery and ticagrelor 3 to 5 days before surgery. We may decide to give platelets before very-high-risk surgery (eg, intracranial, spinal) if there is a decision to continue both antiplatelet drugs—for example, in a patient who recently received a drug-eluting stent (ie, within 3 months). It is essential to involve the cardiologist and surgeon in these decisions.
BOTTOM LINE
This article reviews recommendations and evidence concerning current anticoagulant management for venous thromboembolism and perioperative care, with an emphasis on individualizing treatment for real-world patients.
TREATING ACUTE VENOUS THROMBOEMBOLISM
Case 1: Deep vein thrombosis in an otherwise healthy man
A 40-year-old man presents with 7 days of progressive right leg swelling. He has no antecedent risk factors for deep vein thrombosis or other medical problems. Venous ultrasonography reveals an iliofemoral deep vein thrombosis. How should he be managed?
- Outpatient treatment with low-molecular-weight heparin for 4 to 6 days plus warfarin
- Outpatient treatment with a direct oral anticoagulant, ie, apixaban, dabigatran (which requires 4 to 6 days of initial treatment with low-molecular-weight heparin), or rivaroxaban
- Catheter-directed thrombolysis followed by low-molecular-weight heparin, then warfarin or a direct oral anticoagulant
- Inpatient intravenous heparin for 7 to 10 days, then warfarin or a direct oral anticoagulant
All of these are acceptable for managing acute venous thromboembolism, but the clinician’s role is to identify which treatment is most appropriate for an individual patient.
Deep vein thrombosis is not a single condition
Multiple guidelines exist to help decide on a management strategy. Those of the American College of Chest Physicians (ACCP)1 are used most often.
That said, guidelines are established for “average” patients, so it is important to look beyond them and individualize management. Venous thromboembolism is not a single entity; it has a myriad of clinical presentations that may call for different treatments. Most patients have submassive deep vein thrombosis or pulmonary embolism, which is neither limb-threatening nor associated with hemodynamic instability. Venous thromboembolism also differs in etiology: it can be unprovoked (or idiopathic), cancer-related, catheter-associated, or provoked by surgery or immobility.
Deep vein thrombosis has a wide spectrum of presentations. It can involve the veins of the calf only, or it can involve the femoral and iliac veins and other locations including the splanchnic veins, the cerebral sinuses, and upper extremities. Pulmonary embolism can be massive (defined as being associated with hemodynamic instability or impending respiratory failure) or submassive. Similarly, patients differ in terms of baseline medical conditions, mobility, and lifestyle. Anticoagulant management decisions should take all these factors into account.
Consider clot location
Our patient with iliofemoral deep vein thrombosis is best managed differently from a more typical patient with less extensive thrombosis involving the popliteal or femoral vein segments, or both. A clot involving the iliac vein is more likely to lead to chronic postthrombotic pain and swelling, because there are few venous outflow channels to bypass the clot, creating higher venous pressure within the affected leg. Therefore, catheter-directed thrombolysis is an option that should be considered for our patient.
Catheter-directed thrombolysis trials
According to the “open-vein hypothesis,” quickly eliminating the thrombus and restoring unobstructed venous flow may mitigate the risk not only of recurrent thrombosis, but also of postthrombotic syndrome, which is often not given much consideration acutely but can cause significant, life-altering chronic disability.
The “valve-integrity hypothesis” is also important: prompt clot removal may prevent damage to the venous valves, thereby mitigating venous hypertension.
Thus, catheter-directed thrombolysis offers theoretical benefits, and recent trials have assessed it against standard anticoagulation treatments.
The CaVenT trial (Catheter-Directed Venous Thrombolysis),2 conducted in Norway, randomized 209 patients with midfemoral to iliac deep vein thrombosis to conventional treatment (anticoagulation alone) or anticoagulation plus catheter-directed thrombolysis. At 2 years, postthrombotic syndrome had occurred in 41% of the catheter-directed thrombolysis group compared with 56% of the conventional treatment group (P = .047). At 5 years, the difference widened to 43% vs 71% (P < .01, number needed to treat = 4).3 Despite the superiority of lytic therapy, the incidence of postthrombotic syndrome remained high in patients who received this treatment.
The ATTRACT trial (Acute Venous Thrombosis: Thrombus Removal With Adjunctive Catheter-Directed Thrombolysis),4 a US multicenter, open-label, assessor-blind study, randomized 698 patients with femoral or more-proximal deep vein thrombosis to either standard care (anticoagulant therapy and graduated elastic compression stockings) or standard care plus catheter-directed thrombolysis. In preliminary results presented at the Society of Interventional Radiology meeting in March 2017, although no difference was found in the primary outcome (postthrombotic syndrome at 24 months), catheter-directed thrombolysis for iliofemoral deep vein thrombosis led to a 25% reduction in moderate to severe postthrombotic syndrome.
Although it is too early to draw conclusions before publication of the ATTRACT study, the preliminary results highlight the need to individualize treatment and to be selective about using catheter-directed thrombolysis. The trials provide reassurance that catheter-directed lysis is a reasonable and safe intervention when performed by physicians experienced in the procedure. The risk of major bleeding appears to be low (about 2%) and that for intracranial hemorrhage even lower (< 0.5%).
Catheter-directed thrombolysis is appropriate in some cases
The 2016 ACCP guidelines1 recommend anticoagulant therapy alone over catheter-directed thrombolysis for patients with acute proximal deep vein thrombosis of the leg. However, it is a grade 2C (weak) recommendation.
They provide no specific recommendation as to the clinical indications for catheter-directed thrombolysis, but identify patients who would be most likely to benefit, ie, those who have:
- Iliofemoral deep vein thrombosis
- Symptoms for less than 14 days
- Good functional status
- Life expectancy of more than 1 year
- Low risk of bleeding.
Our patient satisfies these criteria, suggesting that catheter-directed thrombolysis is a reasonable option for him.
Timing is important. Catheter-directed lysis is more likely to be beneficial if used before fibrin deposits form and stiffen the venous valves, causing irreversible damage that leads to postthrombotic syndrome.
Role of direct oral anticoagulants
The availability of direct oral anticoagulants has generated interest in defining their therapeutic role in patients with venous thromboembolism.
In a meta-analysis5 of major trials comparing direct oral anticoagulants and vitamin K antagonists such as warfarin, no significant difference was found for the risk of recurrent venous thromboembolism or venous thromboembolism-related deaths. However, fewer patients experienced major bleeding with direct oral anticoagulants (relative risk 0.61, P = .002). Although significant, the absolute risk reduction was small; the incidence of major bleeding was 1.1% with direct oral anticoagulants vs 1.8% with vitamin K antagonists.
The main advantage of direct oral anticoagulants is greater convenience for the patient.
WHICH PATIENTS ON WARFARIN NEED BRIDGING PREOPERATIVELY?
Many patients still take warfarin, particularly those with atrial fibrillation, a mechanical heart valve, or venous thromboembolism; in many countries, warfarin remains the dominant anticoagulant for stroke prevention. Whether these patients need heparin bridging during perioperative warfarin interruption is a frequently encountered question that, until recently, was controversial. Recent studies have helped clarify the need for bridging in many of these patients.
Case 2: An elderly woman on warfarin facing cancer surgery
A 75-year-old woman weighing 65 kg is scheduled for elective colon resection for incidentally found colon cancer. She is taking warfarin for atrial fibrillation. She also has hypertension and diabetes and had a transient ischemic attack 10 years ago.
One doctor told her she needs to be assessed for heparin bridging, but another told her she does not need bridging.
The default management should be not to bridge patients who have atrial fibrillation, but to consider bridging in selected patients, such as those with recent stroke or transient ischemic attack or a prior thromboembolic event during warfarin interruption. However, decisions about bridging should not be made on the basis of the CHADS2 score alone. For the patient described here, I would recommend not bridging.
Complex factors contribute to stroke risk
Stroke risk for patients with atrial fibrillation can be quickly estimated with the CHADS2 score, based on:
- Congestive heart failure (1 point)
- Hypertension (1 point)
- Age at least 75 (1 point)
- Diabetes (1 point)
- Stroke or transient ischemic attack (2 points).
Our patient has a score of 5, corresponding to an annual adjusted stroke risk of 12.5%. Whether her transient ischemic attack of 10 years ago is comparable in significance to a recent stroke is debatable and highlights a weakness of clinical prediction rules. Moreover, such prediction scores were developed to estimate the long-term risk of stroke if anticoagulants are not given, and they have not been assessed in a perioperative setting where there is short-term interruption of anticoagulants. Also, the perioperative milieu is associated with additional factors not captured in these clinical prediction rules that may affect the risk of stroke.
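For readers who want to compute the score programmatically, the point system above translates directly into a small function (a sketch for illustration; the function and parameter names are ours, not from any clinical software):

```python
def chads2(chf: bool, hypertension: bool, age: int,
           diabetes: bool, stroke_or_tia: bool) -> int:
    """CHADS2 stroke-risk score for atrial fibrillation (range 0-6)."""
    return (int(chf)                  # congestive heart failure: 1 point
            + int(hypertension)       # hypertension: 1 point
            + int(age >= 75)          # age at least 75: 1 point
            + int(diabetes)           # diabetes: 1 point
            + 2 * int(stroke_or_tia)) # prior stroke or TIA: 2 points

# Case 2: age 75, hypertension, diabetes, TIA 10 years ago, no heart failure
score = chads2(chf=False, hypertension=True, age=75,
               diabetes=True, stroke_or_tia=True)
print(score)  # prints 5
```

As the discussion above notes, the score estimates long-term stroke risk off anticoagulation; it was not developed for, and has not been validated in, the perioperative setting.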
Thus, the risk of perioperative stroke likely involves the interplay of multiple factors, including the type of surgery the patient is undergoing. Some factors may be mitigated:
- Rebound hypercoagulability after stopping an oral anticoagulant can be prevented by intraoperative blood pressure and volume control
- Elevated biochemical factors (eg, D-dimer, B-type natriuretic peptide, troponin) may be lowered with perioperative aspirin therapy
- Lipid and genetic factors may be mitigated with perioperative statin use.
Can heparin bridging also mitigate the risk?
Bridging in patients with atrial fibrillation
Most patients who are taking warfarin are doing so because of atrial fibrillation, so most evidence about perioperative bridging was developed in such patients.
The BRIDGE trial (Bridging Anticoagulation in Patients Who Require Temporary Interruption of Warfarin Therapy for an Elective Invasive Procedure or Surgery)6 was the first randomized controlled trial to compare bridging and no-bridging strategies in patients with atrial fibrillation who required warfarin interruption for elective surgery. Nearly 2,000 patients received either low-molecular-weight heparin or placebo, starting 3 days before the procedure and stopping 24 hours before it, then resuming for 5 to 10 days afterward. In all patients, warfarin was stopped 5 days before the procedure and resumed within 24 hours afterward.
A no-bridging strategy was noninferior to bridging: the risk of perioperative arterial thromboembolism was 0.4% without bridging vs 0.3% with bridging (P = .01 for noninferiority). In addition, a no-bridging strategy conferred a lower risk of major bleeding than bridging: 1.3% vs 3.2% (relative risk 0.41, P = .005 for superiority).
Although the difference in absolute bleeding risk was small, bleeding rates were lower than those seen outside of clinical trials, as the bridging protocol used in BRIDGE was designed to minimize the risk of bleeding. Also, although only 5% of patients had a CHADS2 score of 5 or 6, such patients are infrequent in clinical practice, and BRIDGE did include a considerable proportion (17%) of patients with a prior stroke or transient ischemic attack who would be considered at high risk.
Other evidence about heparin bridging is derived from observational studies, more than 10 of which have been conducted. In general, they have found that not bridging is associated with low rates of arterial thromboembolism (< 0.5%) and that bridging is associated with high rates of major bleeding (4%–7%).7–12
Bridging in patients with a mechanical heart valve
Warfarin is the only anticoagulant option for patients who have a mechanical heart valve. No randomized controlled trials have evaluated the benefits of perioperative bridging vs no bridging in this setting.
Observational (cohort) studies suggest that the risk of perioperative arterial thromboembolism is similar with or without bridging anticoagulation, although most patients studied were bridged and those not bridged were considered at low risk (eg, with a bileaflet aortic valve and no additional risk factors).13 However, without stronger evidence from randomized controlled trials, bridging should be the default management for patients with a mechanical heart valve. In our practice, we bridge most patients who have a mechanical heart valve unless they are considered to be at low risk, such as those who have a bileaflet aortic valve.
Bridging in patients with prior venous thromboembolism
Even less evidence is available for periprocedural management of patients who have a history of venous thromboembolism. No randomized controlled trials exist evaluating bridging vs no bridging. In 1 cohort study in which more than 90% of patients had had thromboembolism more than 3 months before the procedure, the rate of recurrent venous thromboembolism without bridging was less than 0.5%.14
It is reasonable to bridge patients who need anticoagulant interruption within 3 months of a diagnosis of deep vein thrombosis or pulmonary embolism, and to consider a temporary inferior vena cava filter for patients who need treatment interruption during the initial 3 to 4 weeks after diagnosis.
Practice guidelines: Perioperative anticoagulation
Guidance for preoperative and postoperative bridging for patients taking warfarin is summarized in Table 2.
CARDIAC PROCEDURES
For patients undergoing implantation of an implantable cardioverter-defibrillator (ICD) or pacemaker, a procedure-specific concern is avoiding device-pocket hematoma.
Patients on warfarin: Do not bridge
The BRUISE CONTROL-1 trial (Bridge or Continue Coumadin for Device Surgery Randomized Controlled Trial)19 randomized patients undergoing pacemaker or ICD implantation to either continued warfarin without bridging (warfarin was continued as long as the international normalized ratio was < 3) or conventional bridging (warfarin was stopped and low-molecular-weight heparin given). A clinically significant device-pocket hematoma occurred in 3.5% of the continued-warfarin group vs 16.0% of the heparin-bridging group (P < .001). Thromboembolic complications were rare, and rates did not differ between the 2 groups.
The results of BRUISE CONTROL-1 serve as a caution against overly aggressive bridging. The study protocol resumed heparin 24 hours after surgery, which is perhaps more aggressive than standard practice. In our practice, we wait at least 24 hours to resume heparin after minor surgery, and 48 to 72 hours after surgery with a higher bleeding risk.
These results are perhaps not surprising when one considers how carefully surgeons control bleeding during surgery in patients taking anticoagulants. In patients who are not anticoagulated during the procedure, minor bleeding may receive less attention; when high doses of heparin are then introduced soon after surgery, those small concerns can become big problems.
Based on these results, it is reasonable to undertake device implantation without interruption of a vitamin K antagonist such as warfarin.
Patients on direct oral anticoagulants: The jury is still out
The similar BRUISE CONTROL-2 trial is currently under way, comparing interruption vs continuation of dabigatran for patients undergoing cardiac device surgery.
In Europe, surgeons are less concerned than those in the United States about operating while a patient is on anticoagulant therapy. But the safety of this practice is not backed by strong evidence.
Direct oral anticoagulants: Consider pharmacokinetics
Direct oral anticoagulants are potent and fast-acting, with a peak effect 1 to 3 hours after intake. This rapid anticoagulant action is similar to that of bridging with low-molecular-weight heparin, and caution is needed when administering direct oral anticoagulants, especially after major surgery or surgery with a high bleeding risk.
Frost et al20 compared the pharmacokinetics of apixaban (twice-daily dosing) and rivaroxaban (once-daily dosing) and found that peak anticoagulant activity is reached faster and is higher with rivaroxaban. This matters because many patients take their anticoagulant first thing in the morning. Consequently, patients who require any kind of procedure (including dental work) should skip the morning dose to avoid undergoing the procedure during the peak anticoagulant effect, and should either omit that day’s dose entirely or defer it until the evening after the procedure.
MANAGING SURGERY FOR PATIENTS ON A DIRECT ORAL ANTICOAGULANT
Case 3: An elderly woman on apixaban facing surgery
Let us imagine that our previous patient takes apixaban instead of warfarin. She is 75 years old, has atrial fibrillation, and is about to undergo elective colon resection for cancer. One doctor advises her to simply stop apixaban for 2 days, while another says she should go off apixaban for 5 days and will need bridging. Which plan is best?
In the perioperative setting, our goal is to interrupt patients’ anticoagulant therapy for the shortest time that results in no residual anticoagulant effect at the time of the procedure.
The guidelines further recommend that if the risk of venous thromboembolism is high, low-molecular-weight heparin bridging should be given while the direct oral anticoagulant is stopped, with the heparin discontinued 24 hours before the procedure. This recommendation seems counterintuitive, as it advises replacing one short-acting anticoagulant with another.
The guidelines committee was unable to provide strength and grading of their recommendations, as too few well-designed studies are available to support them. The doctor in case 3 who advised stopping apixaban for 5 days and bridging is following the guidelines, but without much evidence to support this strategy.
Is bridging needed during interruption of a direct oral anticoagulant?
There are no randomized controlled trials of bridging vs no bridging in patients taking direct oral anticoagulants. Substudies exist of patients taking these drugs for atrial fibrillation whose treatment was interrupted for procedures, but the studies did not randomize bridging vs no bridging, nor were bridging regimens standardized. Three of the four atrial fibrillation trials had a blinded design (warfarin vs direct oral anticoagulant), which complicated perioperative management because physicians did not know which drug, with which pharmacokinetic profile, their patients were taking.22–24
We used the database from the Randomized Evaluation of Long-Term Anticoagulation Therapy (RE-LY) trial22 to evaluate bridging in patients taking either warfarin or dabigatran. With an open-label study design (the blinding was only for the 110 mg and 150 mg dabigatran doses), clinicians were aware of whether patients were receiving warfarin or dabigatran, thereby facilitating perioperative management. Among dabigatran-treated patients, those who were bridged had significantly more major bleeding than those not bridged (6.5% vs 1.8%, P < .001), with no difference between the groups for stroke or systemic embolism. Although it is not a randomized controlled trial, it does provide evidence that bridging may not be advisable for patients taking a direct oral anticoagulant.
The 2017 American College of Cardiology guidelines25 conclude that parenteral bridging is not indicated for direct oral anticoagulants. Although this is not based on strong evidence, the guidance appears reasonable according to the evidence at hand.
The 2017 American Heart Association Guidelines16 recommend a somewhat complex approach based on periprocedural bleeding risk and thromboembolic risk.
How long to interrupt direct oral anticoagulants?
Evidence on interruption timing comes from a prospective cohort study27 of 541 patients treated with dabigatran who were undergoing an elective surgery or invasive procedure. Patients received standard perioperative management, with the timing of the last dabigatran dose before the procedure (24, 48, or 96 hours) based on the bleeding risk of the surgery and the patient’s creatinine clearance. Dabigatran was resumed 24 to 72 hours after the procedure. No heparin bridging was done. Patients were followed for up to 30 days postoperatively. The results were favorable, with few complications: one transient ischemic attack (0.2%), 10 major bleeding episodes (1.8%), and 28 minor bleeding episodes (5.2%).
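The study’s dose-timing rule can be sketched as a simple function. Note that the 50 mL/min creatinine-clearance cutoff used here is an illustrative assumption for the sketch, not necessarily the published protocol’s exact threshold:

```python
def last_dabigatran_dose_hours(high_bleeding_risk: bool,
                               crcl_ml_min: float) -> int:
    """Hours between the last dabigatran dose and the procedure.

    Returns one of the intervals used in the study (24, 48, or 96 hours).
    The 50 mL/min creatinine-clearance cutoff is an assumption made for
    illustration; consult the published protocol for exact thresholds.
    """
    if crcl_ml_min < 50:
        return 96  # reduced clearance: allow the longest washout
    return 48 if high_bleeding_risk else 24

# standard-bleeding-risk procedure, normal renal clearance
print(last_dabigatran_dose_hours(high_bleeding_risk=False,
                                 crcl_ml_min=80))  # prints 24
```

The general shape, longer interruption for higher bleeding risk or reduced renal clearance, is what the study protocol standardized; any real protocol should be taken from the publication, not this sketch.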
A subgroup of 181 patients in this study28 had a plasma sample drawn just before surgery, allowing the investigators to assess the level of coagulation factors after dabigatran interruption. Results were as follows:
- 93% had a normal prothrombin time
- 80% had a normal activated partial thromboplastin time
- 33% had a normal thrombin time
- 81% had a normal dilute thrombin time.
The dilute thrombin time is considered the most reliable test of the anticoagulant effect of dabigatran, but it is not widely available. The activated partial thromboplastin time is more widely available and can be used, albeit less precisely, to assess whether an anticoagulant effect of dabigatran is present; more sensitive activated partial thromboplastin time assays better detect any residual effect.
Dabigatran levels were also measured. Although 66% of patients had low drug levels just before surgery, the others still had substantial dabigatran on board. The fact that bleeding event rates were so low in this study despite the presence of dabigatran in many patients raises the question of whether having some drug on board is a good predictor of bleeding risk.
A protocol with a longer interruption interval (12 to 14 hours longer than in the previous study: 3 days for high-bleeding-risk procedures, 2 days for low-bleeding-risk procedures) normalized the activated partial thromboplastin time and dilute thrombin time in 100% of patients undergoing high-bleeding-risk surgery. However, this study was small, and its interruption strategy needs further investigation.29
Case 3 continued
The PAUSE study (NCT02228798), a multicenter, prospective cohort study, is designed to establish a safe, standardized protocol for the perioperative management of patients with atrial fibrillation taking dabigatran, rivaroxaban, or apixaban and will include 3,300 patients.
PATIENTS WITH A CORONARY STENT WHO NEED SURGERY
Case 4: A woman with a stent facing surgery
A 70-year-old woman needs breast cancer resection. She has coronary artery disease and had a drug-eluting stent placed 5 months ago after elective cardiac catheterization. She also has hypertension, obesity, and type 2 diabetes. Her medications include an angiotensin II receptor blocker, hydrochlorothiazide, insulin, and an oral hypoglycemic. She is also taking aspirin 81 mg daily and ticagrelor (a P2Y12 receptor antagonist) 90 mg twice daily.
Her cardiologist is concerned that stopping antiplatelet therapy could trigger acute stent thrombosis, which has a 50% or higher mortality rate.
Should she stop taking aspirin before surgery? What about the ticagrelor?
Is aspirin safe during surgery?
Evidence concerning aspirin during surgery comes from Perioperative Ischemic Evaluation 2 (POISE-2), a double-blind, randomized controlled trial.30 Patients who had known cardiovascular disease or risk factors for cardiovascular disease and were about to undergo noncardiac surgery were stratified according to whether they had been taking aspirin before the study (patients taking aspirin within 72 hours of the surgery were excluded from randomization). Participants in each group were randomized to take either aspirin or placebo just before surgery. The primary outcome was the combined rate of death or nonfatal myocardial infarction 30 days after randomization.
The study found no difference in the primary end point between the two groups. However, major bleeding occurred significantly more often in the aspirin group (4.6% vs 3.8%, hazard ratio 1.2, 95% confidence interval 1.0–1.5).
Moreover, only 4% of the patients in this trial had a cardiac stent. The trial excluded patients who had had a bare-metal stent placed within 6 weeks or a drug-eluting stent placed within 1 year, so it does not help us answer whether aspirin should be stopped for our current patient.
Is surgery safe for patients with stents?
The safety of undergoing surgery with a stent was investigated in a large US Veterans Administration retrospective cohort study.31 More than 20,000 patients with stents who underwent noncardiac surgery within 2 years of stent placement were compared with a control group of more than 41,000 patients with stents who did not undergo surgery. Patients were matched by stent type and cardiac risk factors at the time of stent placement.
The risk of an adverse cardiac event in both the surgical and nonsurgical cohorts was highest in the initial 6 weeks after stent placement and plateaued 6 months after stent placement, when the risk difference between the surgical and nonsurgical groups leveled off to 1%.
The risk of a major adverse cardiac event postoperatively was much more dependent on the timing of stent placement in complex and inpatient surgeries. For outpatient surgeries, the risk of a major cardiac event was very low and the timing of stent placement did not matter.
A Danish observational study32 compared more than 4,000 patients with drug-eluting stents undergoing surgery with more than 20,000 matched controls without coronary heart disease undergoing similar surgery. The risk of myocardial infarction or cardiac death was much higher for patients undergoing surgery within 1 month of drug-eluting stent placement than for controls without heart disease or for patients whose stent had been placed more than 1 month before surgery.
Our practice is to continue aspirin for surgery in patients with coronary stents regardless of the timing of placement. Although there is a small increased risk of bleeding, this must be balanced against the thrombotic risk. We typically stop clopidogrel 5 to 7 days before surgery and ticagrelor 3 to 5 days before surgery. We may give platelet transfusions before very-high-risk surgery (eg, intracranial, spinal) if both antiplatelet drugs are to be continued, for example in a patient who received a drug-eluting stent within the previous 3 months. It is essential to involve the cardiologist and surgeon in these decisions.
BOTTOM LINE
- Kearon C, Akl EA, Ornelas J, et al. Antithrombotic therapy for VTE disease: CHEST guideline and expert panel report. Chest 2016; 149:315–352.
- Enden T, Haig Y, Klow NE, et al; CaVenT Study Group. Long-term outcome after additional catheter-directed thrombolysis versus standard treatment for acute iliofemoral deep vein thrombosis (the CaVenT study): a randomised controlled trial. Lancet 2012; 379:31–38.
- Haig Y, Enden T, Grotta O, et al; CaVenT Study Group. Post-thrombotic syndrome after catheter-directed thrombolysis for deep vein thrombosis (CaVenT): 5-year follow-up results of an open-label, randomized controlled trial. Lancet Haematol 2016; 3:e64–e71.
- Vedantham S, Goldhaber SZ, Kahn SR, et al. Rationale and design of the ATTRACT Study: a multicenter randomized trial to evaluate pharmacomechanical catheter-directed thrombolysis for the prevention of postthrombotic syndrome in patients with proximal deep vein thrombosis. Am Heart J 2013; 165:523–530.
- Van Es N, Coppens M, Schulman S, Middeldorp S, Buller HR. Direct oral anticoagulants compared with vitamin K antagonists for acute venous thromboembolism: evidence from phase 3 trials. Blood 2014; 124:1968–1975.
- Douketis JD, Spyropoulos AC, Kaatz S, et al; BRIDGE Investigators. Perioperative bridging anticoagulation in patients with atrial fibrillation. N Engl J Med 2015; 373:823–833.
- Douketis J, Johnson JA, Turpie AG. Low-molecular-weight heparin as bridging anticoagulation during interruption of warfarin: assessment of a standardized periprocedural anticoagulation regimen. Arch Intern Med 2004; 164:1319–1326.
- Dunn AS, Spyropoulos AC, Turpie AG. Bridging therapy in patients on long-term oral anticoagulants who require surgery: the Prospective Peri-operative Enoxaparin Cohort Trial (PROSPECT). J Thromb Haemost 2007; 5:2211–2218.
- Kovacs MJ, Kearon C, Rodger M, et al. Single-arm study of bridging therapy with low-molecular-weight heparin for patients at risk of arterial embolism who require temporary interruption of warfarin. Circulation 2004; 110:1658–1663.
- Spyropoulos AC, Turpie AG, Dunn AS, et al; REGIMEN Investigators. Clinical outcomes with unfractionated heparin or low-molecular-weight heparin as bridging therapy in patients on long-term oral anticoagulants: the REGIMEN registry. J Thromb Haemost 2006; 4:1246–1252.
- Douketis JD, Woods K, Foster GA, Crowther MA. Bridging anticoagulation with low-molecular-weight heparin after interruption of warfarin therapy is associated with a residual anticoagulant effect prior to surgery. Thromb Haemost 2005; 94:528–531.
- Schulman S, Hwang HG, Eikelboom JW, Kearon C, Pai M, Delaney J. Loading dose vs. maintenance dose of warfarin for reinitiation after invasive procedures: a randomized trial. J Thromb Haemost 2014; 12:1254–1259.
- Siegal D, Yudin J, Kaatz S, Douketis JD, Lim W, Spyropoulos AC. Periprocedural heparin bridging in patients receiving vitamin K antagonists: systematic review and meta-analysis of bleeding and thromboembolic rates. Circulation 2012; 126:1630–1639.
- Skeith L, Taylor J, Lazo-Langner A, Kovacs MJ. Conservative perioperative anticoagulation management in patients with chronic venous thromboembolic disease: a cohort study. J Thromb Haemost 2012; 10:2298–2304.
- Douketis JD, Spyropoulos AC, Spencer FA, et al. Perioperative management of antithrombotic therapy: Antithrombotic Therapy and Prevention of Thrombosis, 9th ed: American College of Chest Physicians evidence-based clinical practice guidelines. Chest 2012; 141(2 suppl):e326S–e350S.
- Doherty JU, Gluckman TJ, Hucker WJ, et al. 2017 ACC expert consensus decision pathway for periprocedural management of anticoagulation in patients with nonvalvular atrial fibrillation: a report of the American College of Cardiology Clinical Expert Consensus Document Task Force. J Am Coll Cardiol 2017; 69:871–898.
- Raval AN, Cigarroa JE, Chung MK, et al; American Heart Association Clinical Pharmacology Subcommittee of the Acute Cardiac Care and General Cardiology Committee of the Council on Clinical Cardiology; Council on Cardiovascular Disease in the Young; and Council on Quality of Care and Outcomes Research. Management of patients on non-vitamin K antagonist oral anticoagulants in the acute care and periprocedural setting: a scientific statement from the American Heart Association. Circulation 2017; 135:e604–e633.
- Tafur A, Douketis J. Perioperative anticoagulant management in patients with atrial fibrillation: practical implications of recent clinical trials. Pol Arch Med Wewn 2015; 125:666–671.
- Birnie DH, Healey JS, Wells GA, et al; BRUISE CONTROL Investigators. Pacemaker or defibrillator surgery without interruption of anticoagulation. N Engl J Med 2013; 368:2084–2093.
- Frost C, Song Y, Barrett YC, et al. A randomized direct comparison of the pharmacokinetics and pharmacodynamics of apixaban and rivaroxaban. Clin Pharmacol 2014; 6:179–187.
- Narouze S, Benzon HT, Provenzano DA, et al. Interventional spine and pain procedures in patients on antiplatelet and anticoagulant medications: guidelines from the American Society of Regional Anesthesia and Pain Medicine, the European Society of Regional Anesthesia and Pain Therapy, the American Academy of Pain Medicine, the International Neuromodulation Society, the North American Neuromodulation Society, and the World Institute of Pain. Reg Anesth Pain Med 2015; 40:182–212.
- Douketis JD, Healey JS, Brueckmann M, et al. Perioperative bridging anticoagulation during dabigatran or warfarin interruption among patients who had an elective surgery or procedure. Substudy of the RE-LY trial. Thromb Haemost 2015; 113:625–632.
- Steinberg BA, Peterson ED, Kim S, et al; Outcomes Registry for Better Informed Treatment of Atrial Fibrillation Investigators and Patients. Use and outcomes associated with bridging during anticoagulation interruptions in patients with atrial fibrillation: findings from the Outcomes Registry for Better Informed Treatment of Atrial Fibrillation (ORBIT-AF). Circulation 2015; 131:488–494.
- Garcia D, Alexander JH, Wallentin L, et al. Management and clinical outcomes in patients treated with apixaban vs warfarin undergoing procedures. Blood 2014; 124:3692–3698.
- Doherty JU, Gluckman TJ, Hucker WJ, et al. 2017 ACC expert consensus decision pathway for periprocedural management of anticoagulation in patients with nonvalvular atrial fibrillation: a report of the American College of Cardiology Clinical Expert Consensus Document Task Force. J Am Coll Cardiol 2017; 69:871–898.
- Thrombosis Canada. NOACs/DOACs: Peri-operative management. http://thrombosiscanada.ca/?page_id=18#. Accessed August 30, 2017.
- Schulman S, Carrier M, Lee AY, et al; Periop Dabigatran Study Group. Perioperative management of dabigatran: a prospective cohort study. Circulation 2015; 132:167–173.
- Douketis JD, Wang G, Chan N, et al. Effect of standardized perioperative dabigatran interruption on the residual anticoagulation effect at the time of surgery or procedure. J Thromb Haemost 2016; 14:89–97.
- Douketis JD, Syed S, Schulman S. Periprocedural management of direct oral anticoagulants: comment on the 2015 American Society of Regional Anesthesia and Pain Medicine guidelines. Reg Anesth Pain Med 2016; 41:127–129.
- Devereaux PJ, Mrkobrada M, Sessler DI, et al; POISE-2 Investigators. Aspirin in patients undergoing noncardiac surgery. N Engl J Med 2014; 370:1494–1503.
- Holcomb CN, Graham LA, Richman JS, et al. The incremental risk of noncardiac surgery on adverse cardiac events following coronary stenting. J Am Coll Cardiol 2014; 64:2730–2739.
- Egholm G, Kristensen SD, Thim T, et al. Risk associated with surgery within 12 months after coronary drug-eluting stent implantation. J Am Coll Cardiol 2016; 68:2622–2632.
KEY POINTS
- Venous thromboembolism has a myriad of clinical presentations, warranting a holistic management approach that incorporates multiple antithrombotic management strategies.
- A direct oral anticoagulant is an acceptable treatment option in patients with submassive venous thromboembolism, whereas catheter-directed thrombolysis should be considered in patients with iliofemoral deep vein thrombosis, and low-molecular-weight heparin in patients with cancer-associated thrombosis.
- Perioperative management of direct oral anticoagulants should be based on the pharmacokinetic properties of the drug, the patient’s renal function, and the risk of bleeding posed by the surgery or procedure.
- Perioperative heparin bridging can be avoided in most patients who have atrial fibrillation or venous thromboembolism, but should be considered in most patients with a mechanical heart valve.
Diabetes medications and cardiovascular outcome trials: Lessons learned
Since 2008, the US Food and Drug Administration (FDA) has required new diabetes drugs to demonstrate cardiovascular safety, resulting in large and lengthy clinical trials. Under the new regulations, several dipeptidyl peptidase-4 (DPP-4) inhibitors, sodium-glucose cotransporter-2 (SGLT-2) inhibitors, and glucagon-like peptide-1 (GLP-1) receptor agonists have demonstrated cardiovascular safety, with some demonstrating superior cardiovascular efficacy. In 2016, the SGLT-2 inhibitor empagliflozin became the first (and as of this writing, the only) diabetes drug approved by the FDA for a clinical outcome indication, ie, to reduce the risk of cardiovascular death.
DIABETES DRUG DEVELOPMENT
Changing priorities
The International Council for Harmonization of Technical Requirements for Pharmaceuticals for Human Use (ICH) was formed in 1990 as a collaborative effort across global regulatory agencies, coordinated by the World Health Organization, to standardize criteria for drug development. The ICH standards for type 2 diabetes drug development specified the following minimum patient exposures to an investigational product to satisfy new drug application requirements:
- 1,500 individuals total (including single-dose exposure)
- 300–600 patients for 6 months
- 100 patients for 1 year.
Thus, just 250 patient-years of exposure were needed for approval of a drug that patients might take for decades. These standards were unlikely to reveal rare, serious complications, and they could not assess efficacy against clinical outcomes, whether microvascular or macrovascular complications.
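The 250 patient-years figure follows directly from the ICH minimums above, taking the lower bound of 300 patients followed for 6 months; a quick arithmetic sketch:

```python
# Patient-years implied by the ICH minimum exposure standards quoted above,
# using the lower bound of 300 patients followed for 6 months.
six_month_exposure = 300 * 0.5   # 300 patients x 0.5 year = 150 patient-years
one_year_exposure = 100 * 1.0    # 100 patients x 1 year  = 100 patient-years

total_patient_years = six_month_exposure + one_year_exposure
print(total_patient_years)  # 250.0
```

Against this baseline, the post-2008 requirement of more than 15,000 patient-years represents an increase of over 60-fold.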
Since 1995, when metformin was approved in the United States, a new class of antihyperglycemic medication has been approved about once every 2 years, so that by 2008, 12 classes of medications had become available for the treatment of type 2 diabetes. This extraordinary rate of drug development has now yielded more classes of medications to treat type 2 diabetes than we presently have for the treatment of hypertension.
This proliferation of new treatments eased much of the pressure of unmet medical need. It coincided with increasing awareness of the cardiovascular complications of type 2 diabetes and with numerous examples of adverse cardiovascular effects observed with some of the drugs. In this context, the FDA (and in parallel the European Medicines Agency) made paradigm-shifting changes in the requirements for the development of new type 2 diabetes drugs, requiring large-scale randomized clinical outcome data to assess the cardiovascular safety of new drugs. In December 2008, the FDA published a Guidance for Industry1 recommending that sponsors of new drugs for type 2 diabetes demonstrate not only that therapy improves glucose control, but also that it does not, at a minimum, result in an unacceptable increase in cardiovascular risk. To better assess new diabetes drugs, the required patient-years of exposure to the studied drug increased more than 60-fold, from 250 patient-years to more than 15,000.
INCRETIN MODULATORS
The incretin system, a regulator of postprandial glucose metabolism, is an attractive target for glycemic control, as it promotes early satiety and lowers blood glucose.
After a meal, endocrine cells in the distal small intestine secrete the incretin hormones GLP-1 and gastric inhibitory polypeptide (GIP), among others, which reduce gastric motility, stimulate the pancreas to augment glucose-appropriate insulin secretion, and decrease postprandial glucagon release. GLP-1 also interacts with the satiety center of the hypothalamus, suppressing appetite. GLP-1 and GIP are rapidly inactivated by the circulating protease DPP-4. Injectable formulations of GLP-1 receptor agonists that are resistant to DPP-4 degradation have been developed.
Ten incretin modulators are now available in the United States. The 4 available DPP-4 inhibitors are all once-daily oral medications, and the 6 GLP-1 receptor agonists are all injectable (Table 1).
Small studies in humans and animals suggest that DPP-4 inhibitors and GLP-1 receptor agonists may have multiple favorable effects on the cardiovascular system independent of their glycemic effects. These include reducing myocardial infarct size,2–5 improving endothelial function,6 reducing inflammation and oxidative stress,7 reducing atherosclerotic plaque volume,8 improving left ventricular function,9,10 and lowering triglyceride levels.11 However, large clinical trials are needed to determine clinical effectiveness.
DPP-4 INHIBITORS: NOT INFERIOR TO PLACEBO
Saxagliptin
In a meta-analysis of phase 2B and early phase 3 trial data involving almost 5,000 patients, the DPP-4 inhibitor saxagliptin was associated with a dramatic 56% relative risk reduction in cardiovascular death, myocardial infarction, and stroke. However, this analysis was limited by the extremely low number of events, with only 41 patients in the dataset experiencing a cardiovascular event.12
The SAVOR-TIMI 53 trial13 subsequently compared saxagliptin and placebo in a randomized, double-blind trial conducted in 26 countries with nearly 16,500 patients with type 2 diabetes. All patients continued their conventional diabetes treatment at the discretion of their physicians.
During an average follow-up of 2 years, 1,222 events of cardiovascular death, myocardial infarction, or stroke occurred, with no significant difference in event rates between the saxagliptin and placebo groups. The trial thus did not demonstrate the cardiovascular benefit expected from the earlier meta-analysis of phase 2B and phase 3 data, but saxagliptin did not increase cardiovascular risk, making it the first diabetes drug with statistically robust proof of cardiovascular safety.
Further analysis of the SAVOR-TIMI 53 trial data revealed a 27% increased relative risk of heart failure hospitalization with saxagliptin compared with placebo.14 Although the risk was statistically significant, the absolute difference in heart failure incidence between the drug and placebo groups was only 0.7% (3.5% vs 2.8%, respectively). As the average follow-up in the trial was 2 years, the absolute incremental risk of heart failure seen with saxagliptin is 0.35% annually—almost identical in magnitude to the increased heart failure risk with pioglitazone. The increased risk of heart failure was seen within the first 6 months of the trial and persisted throughout the trial, indicating an increased up-front risk of heart failure.
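The annualized figure above is simple arithmetic on the reported rates; a sketch, with the rates taken from the text:

```python
# Heart-failure hospitalization rates for saxagliptin vs placebo in
# SAVOR-TIMI 53, as quoted above, annualized over the ~2-year follow-up.
saxagliptin_rate = 0.035   # 3.5% over the trial
placebo_rate = 0.028       # 2.8% over the trial
followup_years = 2

absolute_difference = saxagliptin_rate - placebo_rate    # 0.7% absolute
annual_increment = absolute_difference / followup_years  # 0.35% per year

print(f"{absolute_difference:.1%} over 2 years; {annual_increment:.2%} per year")
```

The small absolute increment per year is why the statistically significant relative risk increase of 27% corresponds to a modest clinical burden.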
Alogliptin
The EXAMINE trial15 compared the DPP-4 inhibitor alogliptin and placebo in 5,380 patients with type 2 diabetes who had had a recent acute coronary event. Over the 30 months of the trial, more than 600 primary outcome events of cardiovascular death, myocardial infarction, or stroke occurred, with no significant difference between the drug and placebo groups, establishing nominal statistical noninferiority. A numerically higher incidence of heart failure was noted in patients who received alogliptin than in those who received placebo, but the difference was not statistically significant.16 However, the study was not powered to detect such an increased risk. In patients entering the trial with no history of heart failure, the risk of hospitalization for heart failure was 76% higher in the alogliptin group than in the placebo group, with a nominally significant P value (< .05) in this subgroup.
These analyses led the FDA in 2016 to mandate label warnings for saxagliptin and alogliptin regarding the increased risk of heart failure.17
Sitagliptin
The TECOS trial18 tested the DPP-4 inhibitor sitagliptin and, unlike the SAVOR or EXAMINE trials, included hospitalization for unstable angina in the composite end point. Nearly 15,000 patients with type 2 diabetes and established cardiovascular disease were enrolled, and almost 2,500 events occurred. No significant difference was found between the 2 groups.
In a series of prospectively planned analyses, sitagliptin was not associated with an increased risk of hospitalization for heart failure.19 Despite these robust analyses demonstrating no incremental heart failure risk with sitagliptin, in August 2017 the US product label for sitagliptin was modified to include a warning that other DPP-4 inhibitors have been associated with heart failure and to suggest caution. The label for linagliptin received the same FDA-required changes, although no outcomes trial data for linagliptin are yet available.
GLP-1 RECEPTOR AGONISTS
Lixisenatide: Noninferior to placebo
The ELIXA trial20 assessed the cardiovascular safety of the GLP-1 receptor agonist lixisenatide in patients with type 2 diabetes who recently had an acute coronary event. The study enrolled 6,068 patients from 49 countries, and nearly 1,000 events (cardiovascular death, myocardial infarction, stroke, or unstable angina) occurred during the median 25 months of the study. Results showed lixisenatide did not increase or decrease cardiovascular events or adverse events when compared with placebo.
Liraglutide: Evidence of benefit
The LEADER trial21 randomized 9,340 patients with, or at increased risk of, cardiovascular disease to receive the injectable GLP-1 receptor agonist liraglutide or placebo. After a median follow-up of 3.8 years, liraglutide was associated with a statistically significant 13% relative reduction in major adverse cardiovascular events, driven mostly by a 22% reduction in cardiovascular death.
Semaglutide: Evidence of benefit
The SUSTAIN-6 trial22 found a statistically significant 26% relative risk reduction in cardiovascular outcomes comparing once-weekly semaglutide (an injectable GLP-1 receptor agonist) and placebo in 3,297 patients with type 2 diabetes and established cardiovascular disease, chronic kidney disease, or risk factors for cardiovascular disease. The significant reduction in the incidence of nonfatal stroke with semaglutide was the main driver of the observed benefit.
Taspoglutide: Development halted
Taspoglutide was a candidate GLP-1 receptor agonist that underwent clinical trials for cardiovascular outcomes planned to involve about 8,000 patients. The trials were stopped early and drug development was halted after about 600 patient-years of exposure because of antibody formation in about half of patients exposed to taspoglutide, with anaphylactoid reactions and anaphylaxis reported.23
SGLT-2 INHIBITORS
In normal adults, the renal glomeruli filter about 180 g of glucose every day; nearly all of it is reabsorbed by SGLT-2 in the proximal tubules, so that very little glucose is excreted in the urine.24–26 Hereditary glucosuria, a benign condition, is caused by loss-of-function mutations in the gene for SGLT-2. Individuals with this condition rarely if ever develop type 2 diabetes or obesity, an observation that led pharmaceutical researchers to probe SGLT-2 as a therapeutic target.
Inhibitors of SGLT-2 block glucose reabsorption in the renal proximal tubules and lead to glucosuria. Patients treated with an SGLT-2 inhibitor have lower serum glucose levels and lose weight. Inhibitors also reduce sodium reabsorption via SGLT-2 and lead to increased sodium excretion and decreased blood pressure.27
Three SGLT-2 inhibitors are available in the United States: canagliflozin, dapagliflozin, and empagliflozin (Table 1). Ertugliflozin is currently in a phase 3B trial, and cardiovascular outcomes trials are in the planning phase for sotagliflozin, a dual SGLT-1/SGLT-2 inhibitor (SGLT-1 is localized to the gastrointestinal tract).28
Empagliflozin: Evidence of benefit
The EMPA-REG OUTCOME trial29 randomized more than 7,200 patients with type 2 diabetes and atherosclerotic vascular disease to receive the SGLT-2 inhibitor empagliflozin or placebo as once-daily tablets, with both groups receiving off-study treatment for glycemic control at the discretion of their own care providers. Two doses of empagliflozin were evaluated in the trial (10 and 25 mg per day), with the 2 dosing groups pooled for all analyses as prospectively planned.
Patients taking empagliflozin had a 14% relative risk reduction of the composite outcome (cardiovascular death, myocardial infarction, and stroke) vs placebo, with no difference in effect between the 2 randomized doses. The improvement in the composite outcome was seen early in the empagliflozin group and persisted for the 4 years of the study.
This was the first trial of newly developed diabetes drugs that showed a statistically significant reduction in cardiovascular risk. The study revealed a 38% relative risk reduction in cardiovascular death in the treatment group. The risk reduction occurred early in the trial and improved throughout the duration of the study. This is a dramatic finding, unequaled even in trials of drugs that specifically target cardiovascular disease. Both doses of empagliflozin studied provided similar benefit over placebo, reinforcing the validity of the findings. Interestingly, in the empagliflozin group, there was a 35% relative risk reduction in heart failure hospitalizations.
Canagliflozin: Evidence of benefit
The CANVAS Program consisted of two sister trials, CANVAS and CANVAS-R, and examined the safety and efficacy of canagliflozin.30 More than 10,000 participants with type 2 diabetes and atherosclerotic disease or at increased risk of cardiovascular disease were randomized to receive canagliflozin or placebo. Canagliflozin led to a 14% relative risk reduction in the composite outcome of cardiovascular death, nonfatal myocardial infarction, or nonfatal stroke, but there was a statistically significant doubling in the incidence of amputations. Unlike empagliflozin, canagliflozin did not demonstrate a significant reduction in death from cardiovascular causes, suggesting that this may not be a class effect of SGLT-2 inhibitors. As with empagliflozin, canagliflozin led to a 33% relative risk reduction in heart failure hospitalizations.
Cardiovascular benefits independent of glucose-lowering
The cardiovascular benefits of empagliflozin in EMPA-REG OUTCOME and of canagliflozin in CANVAS were observed early, suggesting that the mechanism may involve direct effects on the cardiovascular system rather than glycemic modification.
Improved glycemic control with the SGLT-2 inhibitor was seen early in both studies. However, both trials were designed for glycemic equipoise, encouraging open-label therapy in both groups to reach standard-of-care hemoglobin A1c targets, so the difference in hemoglobin A1c between groups diminished over the course of each trial after its first assessment. Although hemoglobin A1c levels in the SGLT-2 inhibitor groups decreased in the first 12 weeks, they increased over time nearly to the level seen in the placebo group. The adjusted mean hemoglobin A1c in the placebo groups remained near 8.0% throughout the studies, a target consistent with guidelines from the American Diabetes Association and the European Association for the Study of Diabetes31 for the high-risk populations enrolled.
Blood pressure reduction and weight loss do not explain cardiovascular benefits
SGLT-2 inhibitors lower blood pressure independent of their diuretic effects. In the EMPA-REG OUTCOME trial, the adjusted mean systolic blood pressure was 3 to 4 mm Hg lower in the treatment groups than in the placebo group throughout the trial.29 This degree of blood pressure lowering translates to an estimated 10% to 12% relative risk reduction for major adverse cardiovascular events, including heart failure. Although that reduction is meaningful, it cannot explain the 38% reduction in cardiovascular deaths seen in the trial. Canagliflozin produced a similar 4-mm Hg reduction in systolic pressure compared with placebo.30
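As a rough check on this argument, the figures quoted above can be combined directly; the sketch below uses only the numbers stated in the text (the 10% to 12% blood-pressure-attributable estimate and the 38% observed reduction).

```python
# Share of the observed benefit plausibly attributable to blood pressure
# lowering alone, using only the figures quoted in the text.
observed_rrr = 0.38                  # relative risk reduction, CV death
bp_rrr_range = (0.10, 0.12)          # estimated RRR from 3-4 mm Hg lowering

shares = [round(r / observed_rrr, 2) for r in bp_rrr_range]
print(shares)  # [0.26, 0.32]
```

In other words, the blood pressure effect could account for roughly a quarter to a third of the observed mortality benefit, leaving most of it unexplained.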
Weight loss was seen with both empagliflozin and canagliflozin but was not dramatic and is unlikely to account for the described cardiovascular benefits.
Theories of cardiovascular benefit
Several mechanisms have been proposed to help explain the observed cardiovascular benefits of SGLT-2 inhibitors.32
Ketone-body elevation. Ferrannini et al33 found that the fasting blood concentration of the ketone body beta-hydroxybutyrate is about twice as high in patients with type 2 diabetes chronically taking empagliflozin as in patients not receiving the drug. Beta-hydroxybutyrate levels peak after a meal, return to baseline over several hours, and rise again during fasting. Although this elevation is far smaller than the roughly 1,000-fold increase seen in diabetic ketoacidosis, it may reduce myocardial oxygen demand, as beta-hydroxybutyrate is among the most efficient metabolic substrates for the myocardium.
Red blood cell expansion. Perhaps a more likely explanation of the cardiovascular benefit of SGLT-2 inhibitor therapy is the increase in hemoglobin and hematocrit levels. The rise was at first attributed to hemoconcentration secondary to diuresis, but a number of studies have disproven that explanation. In the EMPA-REG OUTCOME trial,29 within 12 weeks of exposure to empagliflozin, hematocrit rose by nearly 4 absolute percentage points compared with placebo. This increase is equivalent to transfusing a unit of red blood cells, favorably affecting myocardial oxygen supply.
Reduction in glomerular hypertension. The kidneys regulate glomerular filtration through the macula densa, an area of specialized cells of the juxtaglomerular apparatus at the distal end of the loop of Henle that responds to the sodium concentration in the urine. SGLT-2 transporters upstream of the macula densa reabsorb sodium along with glucose into the bloodstream, reducing sodium delivery to the macula densa, which senses this as a low-volume state and responds by releasing factors that dilate afferent arterioles and increase glomerular filtration. People with diabetes have more glucose to reabsorb and therefore also reabsorb more sodium, exaggerating this response and leading to glomerular hypertension.
SGLT-2 inhibitors block both glucose and sodium reuptake by SGLT-2, normalizing the response at the macula densa, restoring a normal glomerular filtration rate, and alleviating glomerular hypertension. As the kidney perceives a more normal volume status, renin-angiotensin-aldosterone stimulation is attenuated and sympathetic nervous system activity decreases.27,34 If this model of SGLT-2 inhibitor action in the kidney is correct, these drugs have effects similar to those of angiotensin-converting enzyme (ACE) inhibitors, angiotensin II receptor blockers (ARBs), mineralocorticoid antagonists, and beta-blockers combined.
Kidney benefits
Empagliflozin35 and canagliflozin30 both slowed the progression of kidney dysfunction and led to fewer clinically relevant renal events compared with placebo. Because both treatment and placebo groups also received standard care, including renin-angiotensin-aldosterone system inhibitors and good blood pressure control in many patients, the significant additional benefit of SGLT-2 inhibitors is all the more striking. Beneficial effects on markers of kidney function appeared early, suggesting a favorable hemodynamic effect on the kidney rather than attenuation of microvascular disease through improved glycemic control.
Empagliflozin approved to reduce clinical events
In December 2016, the FDA approved a new indication for empagliflozin: to reduce the risk of cardiovascular death in patients with type 2 diabetes,36 the first-ever clinical outcome indication for a type 2 diabetes medication. The European Society of Cardiology guidelines now include empagliflozin as preferred therapy for type 2 diabetes, recommending it to prevent the onset of heart failure and prolong life.37 This recommendation goes beyond the evidence from the EMPA-REG OUTCOME trial on which it is based, as the trial enrolled only patients with known atherosclerotic vascular disease.
The 2016 European guidelines on cardiovascular disease prevention also recommend that an SGLT-2 inhibitor be considered early for patients with type 2 diabetes and cardiovascular disease to reduce cardiovascular and total mortality.38 The American Diabetes Association, in its 2017 guidelines, likewise endorses empagliflozin for patients with type 2 diabetes and cardiovascular disease.39 That this recommendation rests on the product-labeled outcome indication rather than on glycemic control marks a major shift in the association’s guidance.
Cautions with SGLT-2 inhibitors
- Use SGLT-2 inhibitors with caution in patients with low blood pressure, and monitor blood pressure closely immediately after initiation.
- Consider modifying antihypertensive drugs in patients with labile blood pressure.
- Consider stopping or reducing background diuretics when starting an SGLT-2 inhibitor, and reassess volume status after 1 to 2 weeks.
- For patients on insulin, sulfonylureas, or both, consider decreasing dosages when starting an SGLT-2 inhibitor, and reassess glycemic control periodically.
- Counsel patients about urinary hygiene. Although bacterial urinary tract infections have not emerged as a problem, fungal genital infections have, particularly in women and uncircumcised men.
- Consider SGLT-2 inhibitors to be “sick-day” medications. Patients with diabetes must adjust their diabetes medications if their oral intake is reduced for a day or more, such as while sick or fasting. SGLT-2 inhibitors should not be taken on these days. Cases of diabetic ketoacidosis have arisen in patients who reduced oral intake while continuing their SGLT-2 inhibitor.
OTHER DRUGS WITH DEVELOPMENT HALTED
Aleglitazar, a peroxisome proliferator-activated receptor agonist taken orally once daily, raised high expectations when early studies found that it lowered serum triglycerides and raised high-density lipoprotein cholesterol levels in addition to lowering blood glucose. However, a phase 3 trial in more than 7,000 patients was terminated after a median follow-up of 2 years because of increased rates of heart failure, worsened kidney function, bone fractures, and gastrointestinal bleeding,40 and development of the drug was stopped.
Fasiglifam, a G-protein-coupled receptor 40 agonist, was tested in a cardiovascular clinical outcomes trial. Compared with placebo, fasiglifam reduced hemoglobin A1c levels with low risk of hypoglycemia.41 However, safety concerns about increased liver enzyme levels led to the cessation of the drug’s development.42
HOW WILL THIS AFFECT DIABETES MANAGEMENT?
Metformin is still the most commonly prescribed drug for type 2 diabetes, but the evidence for its cardiovascular benefit is only marginal, and it may not remain first-line therapy in the future. In the EMPA-REG OUTCOME, LEADER, and SUSTAIN-6 trials, the novel diabetes medications were given to patients already treated with available therapies, often including metformin. Empagliflozin, liraglutide, and semaglutide may one day be indicated as first-line therapy for patients with diabetes and atherosclerotic vascular disease.
SGLT-2 inhibitor therapy can cost about $500 per month, and GLP-1 receptor agonists are only slightly less expensive. The cost may be prohibitive for many patients. As more evidence, guidelines, and FDA indications support the use of these novel diabetes drugs, third-party payers and pharmaceutical companies may be motivated to lower costs to reach more patients who can benefit from these therapies.
- US Food and Drug Administration. Guidance for industry. Diabetes mellitus—evaluating cardiovascular risk in new antidiabetic therapies to treat type 2 diabetes. www.fda.gov/downloads/Drugs/.../Guidances/ucm071627.pdf. Accessed September 1, 2017.
- Ye Y, Keyes KT, Zhang C, Perez-Polo JR, Lin Y, Birnbaum Y. The myocardial infarct size-limiting effect of sitagliptin is PKA-dependent, whereas the protective effect of pioglitazone is partially dependent on PKA. Am J Physiol Heart Circ Physiol 2010; 298:H1454–H1465.
- Hocher B, Sharkovska Y, Mark M, Klein T, Pfab T. The novel DPP-4 inhibitors linagliptin and BI 14361 reduce infarct size after myocardial ischemia/reperfusion in rats. Int J Cardiol 2013; 167:87–93.
- Woo JS, Kim W, Ha SJ, et al. Cardioprotective effects of exenatide in patients with ST-segment-elevation myocardial infarction undergoing primary percutaneous coronary intervention: results of exenatide myocardial protection in revascularization study. Arterioscler Thromb Vasc Biol 2013; 33:2252–2260.
- Lønborg J, Vejlstrup N, Kelbæk H, et al. Exenatide reduces reperfusion injury in patients with ST-segment elevation myocardial infarction. Eur Heart J 2012; 33:1491–1499.
- van Poppel PC, Netea MG, Smits P, Tack CJ. Vildagliptin improves endothelium-dependent vasodilatation in type 2 diabetes. Diabetes Care 2011; 34:2072–2077.
- Kröller-Schön S, Knorr M, Hausding M, et al. Glucose-independent improvement of vascular dysfunction in experimental sepsis by dipeptidyl-peptidase 4 inhibition. Cardiovasc Res 2012; 96:140–149.
- Ta NN, Schuyler CA, Li Y, Lopes-Virella MF, Huang Y. DPP-4 (CD26) inhibitor alogliptin inhibits atherosclerosis in diabetic apolipoprotein E-deficient mice. J Cardiovasc Pharmacol 2011; 58:157–166.
- Sauvé M, Ban K, Momen MA, et al. Genetic deletion or pharmacological inhibition of dipeptidyl peptidase-4 improves cardiovascular outcomes after myocardial infarction in mice. Diabetes 2010; 59:1063–1073.
- Read PA, Khan FZ, Heck PM, Hoole SP, Dutka DP. DPP-4 inhibition by sitagliptin improves the myocardial response to dobutamine stress and mitigates stunning in a pilot study of patients with coronary artery disease. Circ Cardiovasc Imaging 2010; 3:195–201.
- Matikainen N, Mänttäri S, Schweizer A, et al. Vildagliptin therapy reduces postprandial intestinal triglyceride-rich lipoprotein particles in patients with type 2 diabetes. Diabetologia 2006; 49:2049–2057.
- Frederich R, Alexander JH, Fiedorek FT, et al. A systematic assessment of cardiovascular outcomes in the saxagliptin drug development program for type 2 diabetes. Postgrad Med 2010; 122:16–27.
- Scirica BM, Bhatt DL, Braunwald E, et al; SAVOR-TIMI 53 Steering Committee and Investigators. Saxagliptin and cardiovascular outcomes in patients with type 2 diabetes mellitus. N Engl J Med 2013; 369:1317–1326.
- Scirica BM, Braunwald E, Raz I, et al; SAVOR-TIMI 53 Steering Committee and Investigators. Heart failure, saxagliptin, and diabetes mellitus: observations from the SAVOR-TIMI 53 randomized trial. Circulation 2014; 130:1579–1588.
- White WB, Cannon CP, Heller SR, et al; EXAMINE Investigators. Alogliptin after acute coronary syndrome in patients with type 2 diabetes. N Engl J Med 2013; 369:1327–1335.
- Zannad F, Cannon CP, Cushman WC, et al; EXAMINE Investigators. Heart failure and mortality outcomes in patients with type 2 diabetes taking alogliptin versus placebo in EXAMINE: a multicentre, randomised, double-blind trial. Lancet 2015; 385:2067–2076.
- US Food and Drug Administration. Diabetes medications containing saxagliptin and alogliptin: drug safety communication—risk of heart failure. https://www.fda.gov/safety/medwatch/safetyinformation/safetyalertsforhumanmedicalproducts/ucm494252.htm. Accessed August 23, 2017.
- Green JB, Bethel MA, Armstrong PW, et al; TECOS Study Group. Effect of sitagliptin on cardiovascular outcomes in type 2 diabetes. N Engl J Med 2015; 373:232–242.
- McGuire DK, Van de Werf F, Armstrong PW, et al; Trial Evaluating Cardiovascular Outcomes With Sitagliptin (TECOS) Study Group. Association between sitagliptin use and heart failure hospitalization and related outcomes in type 2 diabetes mellitus: secondary analysis of a randomized clinical trial. JAMA Cardiol 2016; 1:126–135.
- Pfeffer MA, Claggett B, Diaz R, et al; ELIXA Investigators. Lixisenatide in patients with type 2 diabetes and acute coronary syndrome. N Engl J Med 2015; 373:2247–2257.
- Marso SP, Daniels GH, Brown-Frandsen K, et al; LEADER Steering Committee; LEADER Trial Investigators. Liraglutide and cardiovascular outcomes in type 2 diabetes. N Engl J Med 2016; 375:311–322.
- Marso SP, Bain SC, Consoli A, et al; SUSTAIN-6 Investigators. Semaglutide and cardiovascular outcomes in patients with type 2 diabetes. N Engl J Med 2016; 375:1834–1844.
- Rosenstock J, Balas B, Charbonnel B, et al; T-EMERGE 2 Study Group. The fate of taspoglutide, a weekly GLP-1 receptor agonist, versus twice-daily exenatide for type 2 diabetes: the T-EMERGE 2 trial. Diabetes Care 2013; 36:498–504.
- Wright EM. Renal Na(+)-glucose cotransporters. Am J Physiol Renal Physiol 2001; 280:F10–F18.
- Lee YJ, Lee YJ, Han HJ. Regulatory mechanisms of Na(+)/glucose cotransporters in renal proximal tubule cells. Kidney Int 2007; 72(suppl 106):S27–S35.
- Hummel CS, Lu C, Loo DD, Hirayama BA, Voss AA, Wright EM. Glucose transport by human renal Na+/D-glucose cotransporters SGLT1 and SGLT2. Am J Physiol Cell Physiol 2011; 300:C14–C21.
- Heerspink HJ, Perkins BA, Fitchett DH, Husain M, Cherney DZ. Sodium glucose cotransporter 2 inhibitors in the treatment of diabetes mellitus: cardiovascular and kidney effects, potential mechanisms, and clinical applications. Circulation 2016; 134:752–772.
- Lapuerta P, Zambrowicz B, Strumph P, Sands A. Development of sotagliflozin, a dual sodium-dependent glucose transporter 1/2 inhibitor. Diabetes Vasc Dis Res 2015; 12:101–110.
- Zinman B, Wanner C, Lachin JM, et al, for the EMPA-REG OUTCOME Investigators. Empagliflozin, cardiovascular outcomes, and mortality in type 2 diabetes. N Engl J Med 2015; 373:2117–2128.
- Neal B, Perkovic V, Mahaffey KW, et al, for the CANVAS Program Collaborative Group. Canagliflozin and cardiovascular and renal events in type 2 diabetes. N Engl J Med 2017; 377:644–657.
- Inzucchi SE, Bergenstal RM, Buse JB, et al. Management of hyperglycemia in type 2 diabetes, 2015: a patient-centered approach: update to a position statement of the American Diabetes Association and the European Association for the Study of Diabetes. Diabetes Care 2015; 38:140–149.
- Verma S, McMurray JJV, Cherney DZI. The metabolodiuretic promise of sodium-dependent glucose cotransporter 2 inhibition: the search for the sweet spot in heart failure. JAMA Cardiol 2017; 2:939–940.
- Ferrannini E, Mark M, Mayoux E. CV protection in the EMPA-REG OUTCOME trial: a “thrifty substrate” hypothesis. Diabetes Care 2016; 39:1108–1114.
- Cherney DZ, Perkins BA, Soleymanlou N, et al. Renal hemodynamic effect of sodium-glucose cotransporter 2 inhibition in patients with type 1 diabetes mellitus. Circulation 2014; 129:587–597.
- Wanner C, Inzucchi SE, Lachin JM, et al, for the EMPA-REG OUTCOME Investigators. Empagliflozin and progression of kidney disease in type 2 diabetes. N Engl J Med 2016; 375:323–334.
- US Food and Drug Administration. FDA News Release. FDA approves Jardiance to reduce cardiovascular death in adults with type 2 diabetes. https://www.fda.gov/newsevents/newsroom/pressannouncements/ucm531517.htm. Accessed August 23, 2017.
- Ponikowski P, Voors AA, Anker SD, et al; Authors/Task Force Members; Document Reviewers. 2016 ESC Guidelines for the diagnosis and treatment of acute and chronic heart failure: the Task Force for the diagnosis and treatment of acute and chronic heart failure of the European Society of Cardiology (ESC). Developed with the special contribution of the Heart Failure Association (HFA) of the ESC. Eur J Heart Fail 2016; 18:891–975.
- Piepoli MF, Hoes AW, Agewall S, et al; Authors/Task Force Members. 2016 European guidelines on cardiovascular disease prevention in clinical practice. The Sixth Joint Task Force of the European Society of Cardiology and Other Societies on Cardiovascular Disease Prevention in Clinical Practice (constituted by representatives of 10 societies and by invited experts). Developed with the special contribution of the European Association for Cardiovascular Prevention & Rehabilitation. Eur Heart J 2016; 37:2315–2381.
- American Diabetes Association. American Diabetes Association standards of medical care in diabetes. Diabetes Care 2017; 40(suppl 1):S1–S135.
- Lincoff AM, Tardif JC, Schwartz GG, et al; AleCardio Investigators. Effect of aleglitazar on cardiovascular outcomes after acute coronary syndrome in patients with type 2 diabetes mellitus: the AleCardio randomized clinical trial. JAMA 2014; 311:1515–1525.
- Kaku K, Enya K, Nakaya R, Ohira T, Matsuno R. Efficacy and safety of fasiglifam (TAK-875), a G protein-coupled receptor 40 agonist, in Japanese patients with type 2 diabetes inadequately controlled by diet and exercise: a randomized, double-blind, placebo-controlled, phase III trial. Diabetes Obes Metab 2015; 17:675–681.
- Takeda Press Release. Takeda announces termination of fasiglifam (TAK-875) development. www.takeda.us/newsroom/press_release_detail.aspx?year=2013&id=296. Accessed September 9, 2017.
Since 2008, the US Food and Drug Administration (FDA) has required new diabetes drugs to demonstrate cardiovascular safety, resulting in large and lengthy clinical trials. Under the new regulations, several dipeptidyl peptidase-4 (DPP-4) inhibitors, sodium-glucose cotransporter-2 (SGLT-2) inhibitors, and glucagon-like peptide-1 (GLP-1) receptor agonists have demonstrated cardiovascular safety, with some demonstrating superior cardiovascular efficacy. In 2016, the SGLT-2 inhibitor empagliflozin became the first (and as of this writing, the only) diabetes drug approved by the FDA for a clinical outcome indication, ie, to reduce the risk of cardiovascular death.
DIABETES DRUG DEVELOPMENT
Changing priorities
The International Council for Harmonization of Technical Requirements for Pharmaceuticals for Human Use (ICH) was formed in 1990 as a collaborative effort across global regulatory agencies, coordinated by the World Health Organization, to harmonize criteria for drug development. The ICH standards for type 2 diabetes drug development included the following minimum requirements for patient exposure to investigational products to satisfy new drug application requirements:
- 1,500 individuals total (including single-dose exposure)
- 300–600 patients for 6 months
- 100 patients for 1 year.
Thus, just 250 patient-years of exposure could support approval of a drug that patients might take for decades. These standards were unlikely to reveal rare, serious complications and could not assess efficacy for clinical outcomes, whether microvascular or macrovascular.
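The 250 patient-year figure follows from the minimums listed above; the sketch below takes the 6-month cohort at the low end of its 300 to 600 range (an assumption that reproduces the figure in the text) and ignores the negligible exposure from single-dose participants.

```python
# Patient-years of exposure under the pre-2008 ICH minimums listed above.
# Assumptions: the 6-month cohort is taken at the low end of its 300-600
# range, and single-dose exposures contribute negligibly.
six_month_exposure = 300 * 0.5   # 300 patients followed for 6 months
one_year_exposure = 100 * 1.0    # 100 patients followed for 1 year

total_patient_years = six_month_exposure + one_year_exposure
print(total_patient_years)  # 250.0
```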
Since 1995, when metformin was approved in the United States, a new class of antihyperglycemic medication has been approved about once every 2 years, so that by 2008, 12 classes of medications had become available for the treatment of type 2 diabetes. This extraordinary rate of drug development has now yielded more classes of medications to treat type 2 diabetes than we presently have for the treatment of hypertension.
This proliferation of new treatments relieved much of the pressure of unmet medical need. It coincided with growing awareness of the cardiovascular complications of type 2 diabetes and with several examples of adverse cardiovascular effects of some of the drugs. In this context, the FDA (and, in parallel, the European Medicines Agency) made a paradigm-shifting change in the requirements for developing new type 2 diabetes drugs, requiring large-scale randomized clinical outcome data to assess cardiovascular safety. In December 2008, the FDA published a Guidance for Industry1 recommending that sponsors of new drugs for type 2 diabetes demonstrate not only improved glucose control but also, at a minimum, no unacceptable increase in cardiovascular risk. The required patient-years of exposure to the studied drug rose more than 60-fold, from 250 to more than 15,000.
INCRETIN MODULATORS
The incretin system, a regulator of postprandial glucose metabolism, is an attractive target for glycemic control, as it promotes early satiety and lowers blood glucose.
After a meal, endocrine cells in the small intestine secrete the incretin hormones GLP-1 and gastric inhibitory polypeptide (GIP), among others, which reduce gastric motility, stimulate the pancreas to augment glucose-appropriate insulin secretion, and decrease postprandial glucagon release. GLP-1 also interacts with the satiety center of the hypothalamus, suppressing appetite. GLP-1 and GIP are rapidly inactivated by the circulating protease DPP-4. Injectable formulations of GLP-1 receptor agonists that are resistant to DPP-4 degradation have been developed.
Ten incretin modulators are now available in the United States. The 4 available DPP-4 inhibitors are all once-daily oral medications, and the 6 GLP-1 receptor agonists are all injectable (Table 1).
Small studies in humans and animals suggest that DPP-4 inhibitors and GLP-1 receptor agonists may have multiple favorable effects on the cardiovascular system independent of their glycemic effects. These include reducing myocardial infarct size,2–5 improving endothelial function,6 reducing inflammation and oxidative stress,7 reducing atherosclerotic plaque volume,8 improving left ventricular function,9,10 and lowering triglyceride levels.11 However, large clinical trials are needed to determine clinical effectiveness.
DPP-4 INHIBITORS: NOT INFERIOR TO PLACEBO
Saxagliptin
Saxagliptin, a DPP-4 inhibitor, was associated in a meta-analysis of phase 2B and early phase 3 trial data from almost 5,000 patients with a dramatic 56% relative risk reduction in cardiovascular death, myocardial infarction, and stroke. However, the analysis was limited by an extremely low event count: only 41 patients in the dataset had cardiovascular events.12
The SAVOR-TIMI 53 trial13 subsequently compared saxagliptin and placebo in a randomized, double-blind trial conducted in 26 countries with nearly 16,500 patients with type 2 diabetes. All patients continued their conventional diabetes treatment at the discretion of their physicians.
During an average follow-up of 2 years, 1,222 events of cardiovascular death, myocardial infarction, or stroke occurred, with no significant difference in event rates between the saxagliptin and placebo groups. The cardiovascular benefit suggested by the earlier meta-analysis of phase 2B and phase 3 data was not borne out, but saxagliptin did not increase cardiovascular risk, making it the first diabetes drug with robust statistical proof of cardiovascular safety.
Further analysis of the SAVOR-TIMI 53 trial data revealed a 27% increase in the relative risk of heart failure hospitalization with saxagliptin compared with placebo.14 Although statistically significant, the absolute difference in heart failure incidence between the drug and placebo groups was only 0.7% (3.5% vs 2.8%, respectively). With an average follow-up of 2 years, the absolute incremental risk of heart failure with saxagliptin is 0.35% annually, almost identical in magnitude to the increased heart failure risk with pioglitazone. The increased risk appeared within the first 6 months and persisted throughout the trial, indicating an up-front increase in heart failure risk.
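The relation between the relative and absolute figures above is simple arithmetic on the quoted event rates (3.5% vs 2.8% over an average 2-year follow-up); note that the 27% figure in the trial report is a hazard ratio, which these crude cumulative rates only approximate.

```python
# Heart failure hospitalization in SAVOR-TIMI 53, rates as quoted above.
saxagliptin_rate = 0.035   # cumulative incidence over ~2 years
placebo_rate = 0.028

abs_difference = saxagliptin_rate - placebo_rate   # absolute risk difference
annualized = abs_difference / 2                    # average follow-up 2 years
nnh = 1 / abs_difference                           # number needed to harm

print(f"{abs_difference:.3f} absolute, {annualized:.4f}/yr, NNH ~{nnh:.0f}")
# 0.007 absolute, 0.0035/yr, NNH ~143
```

So for roughly every 143 patients treated for 2 years, one additional heart failure hospitalization would be expected, which is why a statistically significant relative increase corresponds to a modest absolute one.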
Alogliptin
The EXAMINE trial15 compared the DPP-4 inhibitor alogliptin with placebo in 5,380 patients with type 2 diabetes who had had a recent acute coronary event. Over the 30 months of the trial, more than 600 primary outcome events of cardiovascular death, myocardial infarction, or stroke occurred, with no significant difference between the drug and placebo groups, establishing nominal statistical noninferiority. The incidence of heart failure was numerically higher with alogliptin than with placebo, but the difference was not statistically significant,16 and the study was not powered to detect such an increase. Among patients entering the trial with no history of heart failure, however, the risk of hospitalization for heart failure was 76% higher in the alogliptin group than in the placebo group, with a nominally significant P value of less than .05 in this subgroup.
These analyses led the FDA in 2016 to mandate label warnings for saxagliptin and alogliptin regarding the increased risk of heart failure.17
Sitagliptin
The TECOS trial18 tested the DPP-4 inhibitor sitagliptin and, unlike the SAVOR or EXAMINE trials, included hospitalization for unstable angina in the composite end point. Nearly 15,000 patients with type 2 diabetes and established cardiovascular disease were enrolled, and almost 2,500 events occurred. No significant difference was found between the 2 groups.
In a series of prospectively planned analyses, sitagliptin was not associated with an increased risk of hospitalization for heart failure.19 Despite these robust analyses demonstrating no incremental heart failure risk, in August 2017 the US product label for sitagliptin was modified to warn that other DPP-4 inhibitors have been associated with heart failure and to suggest caution. The label for linagliptin received the same FDA-required changes, even though no outcomes-trial data for linagliptin are yet available.
GLP-1 RECEPTOR AGONISTS
Lixisenatide: Noninferior to placebo
The ELIXA trial20 assessed the cardiovascular safety of the GLP-1 receptor agonist lixisenatide in patients with type 2 diabetes who recently had an acute coronary event. The study enrolled 6,068 patients from 49 countries, and nearly 1,000 events (cardiovascular death, myocardial infarction, stroke, or unstable angina) occurred during the median 25 months of the study. Results showed lixisenatide did not increase or decrease cardiovascular events or adverse events when compared with placebo.
Liraglutide: Evidence of benefit
The LEADER trial21 randomized 9,340 patients with or at increased risk for cardiovascular disease to receive the injectable GLP-1 receptor agonist liraglutide or placebo. After a median of 3.8 years of follow-up, liraglutide use was associated with a statistically significant 13% relative reduction in major adverse cardiovascular events, mostly driven by a 22% reduction in cardiovascular death.
Semaglutide: Evidence of benefit
The SUSTAIN-6 trial22 found a statistically significant 26% relative risk reduction in cardiovascular outcomes comparing once-weekly semaglutide (an injectable GLP-1 receptor agonist) and placebo in 3,297 patients with type 2 diabetes and established cardiovascular disease, chronic kidney disease, or risk factors for cardiovascular disease. The significant reduction in the incidence of nonfatal stroke with semaglutide was the main driver of the observed benefit.
Taspoglutide: Development halted
Taspoglutide was a candidate GLP-1 receptor agonist that underwent clinical trials for cardiovascular outcomes planned to involve about 8,000 patients. The trials were stopped early and drug development was halted after about 600 patient-years of exposure because of antibody formation in about half of patients exposed to taspoglutide, with anaphylactoid reactions and anaphylaxis reported.23
SGLT-2 INHIBITORS
In normal adults, the renal glomeruli filter about 180 g of glucose every day; nearly all of it is reabsorbed by SGLT-2 in the proximal tubules, so that very little glucose is excreted in the urine.24–26 Hereditary glucosuria, a benign condition, results from loss-of-function mutations in the gene for SGLT-2. Individuals with this condition rarely if ever develop type 2 diabetes or obesity, an observation that led pharmaceutical researchers to pursue SGLT-2 as a therapeutic target.
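The 180 g/day figure follows from typical physiologic values; the sketch below assumes a normal glomerular filtration rate of about 125 mL/min and a plasma glucose of 100 mg/dL (1 g/L), which are textbook assumptions rather than values given in the article.

```python
# Back-of-the-envelope filtered glucose load in a normal adult.
# Assumptions (not from the article): GFR ~125 mL/min, glucose ~100 mg/dL.
gfr_l_per_day = 125 / 1000 * 60 * 24   # 125 mL/min -> 180 L filtered per day
plasma_glucose_g_per_l = 1.0           # 100 mg/dL = 1 g/L

filtered_glucose_g_per_day = gfr_l_per_day * plasma_glucose_g_per_l
print(filtered_glucose_g_per_day)  # 180.0
```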
Inhibitors of SGLT-2 block glucose reabsorption in the renal proximal tubules and lead to glucosuria. Patients treated with an SGLT-2 inhibitor have lower serum glucose levels and lose weight. Inhibitors also reduce sodium reabsorption via SGLT-2 and lead to increased sodium excretion and decreased blood pressure.27
Three SGLT-2 inhibitors are available in the United States: canagliflozin, dapagliflozin, and empagliflozin (Table 1). Ertugliflozin is currently in a phase 3B trial, and cardiovascular outcomes trials are being planned for sotagliflozin, a dual SGLT-1/SGLT-2 inhibitor (SGLT-1 is localized to the gastrointestinal tract).28
Empagliflozin: Evidence of benefit
The EMPA-REG OUTCOME trial29 randomized more than 7,200 patients with type 2 diabetes and atherosclerotic vascular disease to receive the SGLT-2 inhibitor empagliflozin or placebo as once-daily tablets, with both groups receiving off-study treatment for glycemic control at the discretion of their own care providers. Two doses of empagliflozin were evaluated in the trial (10 and 25 mg per day), with the 2 dosing groups pooled for all analyses as prospectively planned.
Patients taking empagliflozin had a 14% relative risk reduction of the composite outcome (cardiovascular death, myocardial infarction, and stroke) vs placebo, with no difference in effect between the 2 randomized doses. The improvement in the composite outcome was seen early in the empagliflozin group and persisted for the 4 years of the study.
Ketone-body elevation. Ferrannini et al33 found that the blood concentration of the ketone-body beta-hydroxybutyrate is about twice as high in patients with type 2 diabetes in the fasting state who are chronically taking empagliflozin as in patients not receiving the drug. Beta-hydroxybutyrate levels peak after a meal and then return to baseline over several hours before rising again during the fasting period. Although the ketone elevation is not nearly as extreme as in diabetic ketoacidosis (about a 1,000-fold increase), the observed increase may reduce myocardial oxygen demand, as beta-hydroxybutyrate is among the most efficient metabolic substrates for the myocardium.
Red blood cell expansion. Perhaps a more likely explanation of the cardiovascular benefit seen with SGLT-2 inhibitor therapy is the increase in hemoglobin and hematocrit levels. At first attributed to hemoconcentration secondary to diuresis, this has been disproven by a number of studies. The EMPA-REG OUTCOME trial29 found that within 12 weeks of exposure to empagliflozin, hematocrit levels rose nearly 4% absolutely compared with the levels in the placebo group. This increase is equivalent to transfusing a unit of red blood cells, favorably affecting myocardial oxygen supply.
Reduction in glomerular hypertension. The kidneys regulate glomerular filtration in a process involving the macula densa, an area of specialized cells in the juxtaglomerular apparatus in the loop of Henle that responds to sodium concentration in the urine. Normally, SGLT-2 receptors upstream from the loop of Henle reabsorb sodium and glucose into the bloodstream, reducing sodium delivery to the macula densa, which senses this as a low-volume state. The macula densa cells respond by releasing factors that dilate afferent arterioles and increase glomerular filtration. People with diabetes have more glucose to reabsorb and therefore also reabsorb more sodium, leading to glomerular hypertension.
SGLT-2 inhibitors block both glucose and sodium reuptake at SGLT-2 receptors, normalizing the response at the macula densa, restoring a normal glomerular filtration rate, and alleviating glomerular hypertension. As the kidney perceives a more normal volume status, renin-angiotensin-aldosterone stimulation is attenuated and sympathetic nervous system activity improves.27,34 If this model of SGLT-2 inhibitor effects on the kidney is correct, these drugs have similar effects as angiotensin-converting enzyme (ACE) inhibitors, angiotensin II receptor blockers (ARBs), mineralocorticoid antagonists, and beta-blockers combined.
Kidney benefits
Empagliflozin35 and canagliflozin30 both reduced the rate of progression of kidney dysfunction and led to fewer clinically relevant renal events compared with placebo. Treatment and placebo groups also received standard care, so many patients were treated with renin-angiotensin-aldosterone system inhibitors and with good blood pressure control, making the finding that SGLT-2 inhibitors had a significant beneficial effect even more dramatic. Beneficial effects on markers of kidney function were seen early on, suggesting a more favorable hemodynamic effect on the kidney rather than improved glycemic control attenuating microvascular disease.
Empagliflozin approved to reduce clinical events
In December 2016, the FDA approved the indication for empagliflozin to reduce the risk of cardiovascular death in patients with type 2 diabetes,36 the first-ever clinical outcome indication for a type 2 diabetes medication. The European Society of Cardiology guidelines now include empagliflozin as preferred therapy for type 2 diabetes, recommending it to prevent the onset of heart failure and prolong life.37 This recommendation goes beyond the evidence from the EMPA-REG OUTCOME trial on which it is based, as the trial only studied patients with known atherosclerotic vascular disease.
The 2016 European Guidelines on cardiovascular disease prevention also recommend that an SGLT-2 inhibitor be considered early for patients with type 2 diabetes and cardiovascular disease to reduce cardiovascular and total mortality.38 The American Diabetes Association in their 2017 guidelines also endorse empagliflozin for treating patients with type 2 diabetes and cardiovascular disease.39 The fact that the American Diabetes Association recommendation is not based on glycemic control, in line with the product-labeled indication, is a major shift in the association’s guidance.
Cautions with SGLT-2 inhibitors
- Use SGLT-2 inhibitors in patients with low blood pressure with caution, and with increased blood pressure monitoring just following initiation.
- Consider modifying antihypertensive drugs in patients with labile blood pressure.
- Consider stopping or reducing background diuretics when starting an SGLT-2 inhibitor, and reassess volume status after 1 to 2 weeks.
- For patients on insulin, sulfonylureas, or both, consider decreasing dosages when starting an SGLT-2 inhibitor, and reassess glycemic control periodically.
- Counsel patients about urinary hygiene. Although bacterial urinary tract infections have not emerged as a problem, fungal genital infections have, particularly in women and uncircumcised men.
- Consider SGLT-2 inhibitors to be “sick-day” medications. Patients with diabetes must adjust their diabetes medications if their oral intake is reduced for a day or more, such as while sick or fasting. SGLT-2 inhibitors should not be taken on these days. Cases of diabetic ketoacidosis have arisen in patients who reduced oral intake while continuing their SGLT-2 inhibitor.
OTHER DRUGS WITH DEVELOPMENT HALTED
Aleglitazar, a peroxisome proliferator-activated receptor agonist taken orally once daily, raised high expectations when it was found in early studies to lower serum triglycerides and raise high-density lipoprotein cholesterol levels in addition to lowering blood glucose. However, a phase 3 trial in more than 7,000 patients was terminated after a median follow up of 2 years because of increased rates of heart failure, worsened kidney function, bone fractures, and gastrointestinal bleeding.40 Development of this drug was stopped.
Fasiglifam, a G-protein-coupled receptor 40 agonist, was tested in a cardiovascular clinical outcomes trial. Compared with placebo, fasiglifam reduced hemoglobin A1c levels with low risk of hypoglycemia.41 However, safety concerns about increased liver enzyme levels led to the cessation of the drug’s development.42
HOW WILL THIS AFFECT DIABETES MANAGEMENT?
Metformin is still the most commonly prescribed drug for type 2 diabetes but has only marginal evidence for its cardiovascular benefits and may not be the first-line therapy for the management of diabetes in the future. In the EMPA REG OUTCOME, LEADER, and SUSTAIN-6 trials, the novel diabetes medications were given to patients who were already treated with available therapies, often including metformin. Treatment with empagliflozin, liraglutide, and semaglutide may be indicated for patients with diabetes and atherosclerotic vascular disease as first-line therapies in the future.
SGLT-2 inhibitor therapy can cost about $500 per month, and GLP-1 inhibitors are only slightly less expensive. The cost may be prohibitive for many patients. As more evidence, guidelines, and FDA criteria support the use of these novel diabetes drugs, third-party payers and pharmaceutical companies may be motivated to lower costs to help reach more patients who can benefit from these therapies.
Since 2008, the US Food and Drug Administration (FDA) has required new diabetes drugs to demonstrate cardiovascular safety, resulting in large and lengthy clinical trials. Under the new regulations, several dipeptidyl peptidase-4 (DPP-4) inhibitors, sodium-glucose cotransporter-2 (SGLT-2) inhibitors, and glucagon-like peptide-1 (GLP-1) receptor agonists have demonstrated cardiovascular safety, with some demonstrating superior cardiovascular efficacy. In 2016, the SGLT-2 inhibitor empagliflozin became the first (and as of this writing, the only) diabetes drug approved by the FDA for a clinical outcome indication, ie, to reduce the risk of cardiovascular death.
DIABETES DRUG DEVELOPMENT
Changing priorities
The International Council for Harmonization of Technical Requirements for Pharmaceuticals for Human Use (ICH) was formed in 1990 as a collaborative effort among global regulatory agencies, coordinated by the World Health Organization, to harmonize criteria for drug development. The ICH standards for type 2 diabetes drug development included the following minimum patient exposure to investigational products to satisfy new drug application requirements:
- 1,500 individuals total (including single-dose exposure)
- 300–600 patients for 6 months
- 100 patients for 1 year.
Thus, as few as 250 patient-years of exposure could support approval of a drug that patients might take for decades. These standards were unlikely to reveal rare, serious complications and could not assess clinical efficacy against either microvascular or macrovascular complications.
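The arithmetic behind that 250 figure can be made explicit. A minimal sketch, assuming the 6-month and 1-year cohorts are counted separately (the exact counting convention is one reading of the ICH minimums, not stated in the standards):

```python
# Rough patient-years of exposure implied by the ICH minimums for a
# type 2 diabetes new drug application. How the cohorts overlap is an
# assumption; this takes the lower bound of each requirement.
six_month_cohort = 300   # lower bound of "300-600 patients for 6 months"
one_year_cohort = 100    # "100 patients for 1 year"

patient_years = six_month_cohort * 0.5 + one_year_cohort * 1.0
print(patient_years)     # 250.0

# The 2008 FDA guidance pushed required exposure to more than 15,000
# patient-years, an increase of over 60-fold.
fold_increase = 15_000 / patient_years
print(fold_increase)     # 60.0
```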
Since 1995, when metformin was approved in the United States, a new class of antihyperglycemic medication has been approved about once every 2 years, so that by 2008, 12 classes of medications had become available for the treatment of type 2 diabetes. This extraordinary rate of drug development has now yielded more classes of medications to treat type 2 diabetes than we presently have for the treatment of hypertension.
This proliferation of new treatments relieved much of the pressure of unmet medical need. At the same time, awareness of the cardiovascular complications of type 2 diabetes was growing, and adverse cardiovascular effects had been observed with some of the drugs. In this context, the FDA (and, in parallel, the European Medicines Agency) made paradigm-shifting changes in the requirements for developing new type 2 diabetes drugs, requiring large-scale randomized clinical outcome data to assess the cardiovascular safety of new agents. In December 2008, the FDA published a Guidance for Industry1 recommending that sponsors of new drugs for type 2 diabetes demonstrate not only that therapy improves glucose control, but also, at a minimum, that it does not result in an unacceptable increase in cardiovascular risk. To better assess new diabetes drugs, the required patient-years of exposure to the studied drug increased more than 60-fold, from 250 patient-years to more than 15,000.
INCRETIN MODULATORS
The incretin system, a regulator of postprandial glucose metabolism, is an attractive target for glycemic control, as it promotes early satiety and lowers blood glucose.
After a meal, enteroendocrine cells in the small intestine secrete the incretin hormones GLP-1 and gastric inhibitory polypeptide (GIP), among others, which reduce gastric motility, stimulate the pancreas to augment glucose-appropriate insulin secretion, and decrease postprandial glucagon release. GLP-1 also interacts with the satiety center of the hypothalamus, suppressing appetite. GLP-1 and GIP are rapidly inactivated by the circulating protease DPP-4. Injectable formulations of GLP-1 receptor agonists that resist DPP-4 degradation have been developed.
Ten incretin modulators are now available in the United States. The 4 available DPP-4 inhibitors are all once-daily oral medications, and the 6 GLP-1 receptor agonists are all injectable (Table 1).
Small studies in humans and animals suggest that DPP-4 inhibitors and GLP-1 receptor agonists may have multiple favorable effects on the cardiovascular system independent of their glycemic effects. These include reducing myocardial infarct size,2–5 improving endothelial function,6 reducing inflammation and oxidative stress,7 reducing atherosclerotic plaque volume,8 improving left ventricular function,9,10 and lowering triglyceride levels.11 However, large clinical trials are needed to determine clinical effectiveness.
DPP-4 INHIBITORS: NOT INFERIOR TO PLACEBO
Saxagliptin
Saxagliptin, a DPP-4 inhibitor, was associated with a dramatic 56% relative risk reduction in cardiovascular death, myocardial infarction, and stroke in a meta-analysis of phase 2B and early phase 3 trial data involving almost 5,000 patients. However, the analysis was limited by the very small number of events: only 41 patients in the dataset had a cardiovascular event.12
The SAVOR-TIMI 53 trial13 subsequently compared saxagliptin and placebo in a randomized, double-blind trial conducted in 26 countries with nearly 16,500 patients with type 2 diabetes. All patients continued their conventional diabetes treatment at the discretion of their physicians.
During an average follow-up of 2 years, 1,222 events of cardiovascular death, myocardial infarction, or stroke occurred, with no significant difference in event rates between the saxagliptin and placebo groups. These results did not confirm the cardiovascular benefit suggested by the earlier meta-analysis of phase 2B and phase 3 data, but saxagliptin did not increase cardiovascular risk, making it the first diabetes drug with robustly, statistically proven cardiovascular safety.
Further analysis of the SAVOR-TIMI 53 trial data revealed a 27% increased relative risk of heart failure hospitalization with saxagliptin compared with placebo.14 Although the risk was statistically significant, the absolute difference in heart failure incidence between the drug and placebo groups was only 0.7% (3.5% vs 2.8%, respectively). As the average follow-up in the trial was 2 years, the absolute incremental risk of heart failure seen with saxagliptin is 0.35% annually—almost identical in magnitude to the increased heart failure risk with pioglitazone. The increased risk of heart failure was seen within the first 6 months of the trial and persisted throughout the trial, indicating an increased up-front risk of heart failure.
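The relationship between the relative and absolute heart failure numbers above can be checked directly. A small sketch using the incidence figures quoted (the crude incidence ratio approximates, but does not exactly equal, the reported hazard-ratio-based 27%; the number needed to harm is an illustrative extra, not stated in the trial report):

```python
# Heart failure hospitalization incidence in SAVOR-TIMI 53
# (saxagliptin vs placebo) over the ~2-year average follow-up.
saxagliptin_pct = 3.5
placebo_pct = 2.8
followup_years = 2

absolute_diff = round(saxagliptin_pct - placebo_pct, 2)  # 0.7 percentage points
annualized = absolute_diff / followup_years              # 0.35 per year

# Crude relative increase (approximates the reported 27% hazard ratio):
relative_increase = round((saxagliptin_pct / placebo_pct - 1) * 100)  # 25

# Hypothetical number needed to harm over the trial period:
nnh = round(1 / (absolute_diff / 100))                   # ~143 patients
print(absolute_diff, annualized, relative_increase, nnh)
```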
Alogliptin
The EXAMINE trial15 compared the DPP-4 inhibitor alogliptin and placebo in 5,380 patients with type 2 diabetes who had had a recent acute coronary event. Over the 30 months of the trial, more than 600 primary outcome events of cardiovascular death, myocardial infarction, or stroke occurred, with no significant difference between the drug and placebo groups, establishing nominal statistical noninferiority. A numerically higher incidence of heart failure was noted with alogliptin than with placebo, but the difference was not statistically significant,16 and the study was not powered to detect such a risk. However, in patients entering the trial with no history of heart failure, the risk of hospitalization for heart failure was 76% higher in the alogliptin group than in the placebo group, a nominally significant difference (P < .05) in this subgroup.
These analyses led the FDA in 2016 to mandate label warnings for saxagliptin and alogliptin regarding the increased risk of heart failure.17
Sitagliptin
The TECOS trial18 tested the DPP-4 inhibitor sitagliptin and, unlike the SAVOR or EXAMINE trials, included hospitalization for unstable angina in the composite end point. Nearly 15,000 patients with type 2 diabetes and established cardiovascular disease were enrolled, and almost 2,500 events occurred. No significant difference was found between the 2 groups.
In a series of prospectively planned analyses, sitagliptin was not associated with an increased risk of hospitalization for heart failure.19 Nevertheless, despite these robust analyses demonstrating no incremental heart failure risk with sitagliptin, in August 2017 the US product label for sitagliptin was modified to include a warning that other DPP-4 inhibitors have been associated with heart failure and to suggest caution. The label for linagliptin received the same FDA-required changes, even though no data from outcomes trials with linagliptin were yet available.
GLP-1 RECEPTOR AGONISTS
Lixisenatide: Noninferior to placebo
The ELIXA trial20 assessed the cardiovascular safety of the GLP-1 receptor agonist lixisenatide in patients with type 2 diabetes who recently had an acute coronary event. The study enrolled 6,068 patients from 49 countries, and nearly 1,000 events (cardiovascular death, myocardial infarction, stroke, or unstable angina) occurred during the median 25 months of the study. Results showed lixisenatide did not increase or decrease cardiovascular events or adverse events when compared with placebo.
Liraglutide: Evidence of benefit
The LEADER trial21 randomized 9,340 patients with type 2 diabetes and established cardiovascular disease or at increased cardiovascular risk to receive the injectable GLP-1 receptor agonist liraglutide or placebo. After a median of 3.8 years of follow-up, liraglutide use was associated with a statistically significant 13% relative reduction in major adverse cardiovascular events, driven mostly by a 22% reduction in cardiovascular death.
Semaglutide: Evidence of benefit
The SUSTAIN-6 trial22 found a statistically significant 26% relative risk reduction in cardiovascular outcomes comparing once-weekly semaglutide (an injectable GLP-1 receptor agonist) and placebo in 3,297 patients with type 2 diabetes and established cardiovascular disease, chronic kidney disease, or risk factors for cardiovascular disease. The significant reduction in the incidence of nonfatal stroke with semaglutide was the main driver of the observed benefit.
Taspoglutide: Development halted
Taspoglutide was a candidate GLP-1 receptor agonist that underwent clinical trials for cardiovascular outcomes planned to involve about 8,000 patients. The trials were stopped early and drug development was halted after about 600 patient-years of exposure because of antibody formation in about half of patients exposed to taspoglutide, with anaphylactoid reactions and anaphylaxis reported.23
SGLT-2 INHIBITORS
The renal glomeruli filter about 180 g of glucose every day in normal adults; nearly all of it is reabsorbed by SGLT-2 in the proximal tubules, so that very little glucose is excreted in the urine.24–26 The benign condition hereditary glucosuria occurs due to loss-of-function mutations in the gene for SGLT-2. Individuals with this condition rarely if ever develop type 2 diabetes or obesity, and this observation led pharmaceutical researchers to probe SGLT-2 as a therapeutic target.
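The 180-g figure follows directly from typical textbook values for glomerular filtration rate and plasma glucose. A quick back-of-the-envelope check (normal adult values assumed):

```python
# Daily filtered glucose load in a normal adult, from typical values.
gfr_ml_per_min = 125              # normal glomerular filtration rate
plasma_glucose_mg_per_dl = 100    # normal fasting plasma glucose

filtered_volume_l_per_day = gfr_ml_per_min * 60 * 24 / 1000  # 180 L/day
glucose_g_per_l = plasma_glucose_mg_per_dl * 10 / 1000       # 1 g/L

filtered_glucose_g_per_day = filtered_volume_l_per_day * glucose_g_per_l
print(filtered_glucose_g_per_day)  # 180.0 g/day, nearly all reabsorbed by SGLT-2
```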
Inhibitors of SGLT-2 block glucose reabsorption in the renal proximal tubules and lead to glucosuria. Patients treated with an SGLT-2 inhibitor have lower serum glucose levels and lose weight. Inhibitors also reduce sodium reabsorption via SGLT-2 and lead to increased sodium excretion and decreased blood pressure.27
Three SGLT-2 inhibitors are available in the United States: canagliflozin, dapagliflozin, and empagliflozin (Table 1). Ertugliflozin is currently in a phase 3B trial, and cardiovascular outcomes trials are being planned for sotagliflozin, a dual SGLT-1/SGLT-2 inhibitor (SGLT-1 is localized to the gastrointestinal tract).28
Empagliflozin: Evidence of benefit
The EMPA-REG OUTCOME trial29 randomized more than 7,200 patients with type 2 diabetes and atherosclerotic vascular disease to receive the SGLT-2 inhibitor empagliflozin or placebo as once-daily tablets, with both groups receiving off-study treatment for glycemic control at the discretion of their own care providers. Two doses of empagliflozin were evaluated in the trial (10 and 25 mg per day), with the 2 dosing groups pooled for all analyses as prospectively planned.
Patients taking empagliflozin had a 14% relative risk reduction of the composite outcome (cardiovascular death, myocardial infarction, and stroke) vs placebo, with no difference in effect between the 2 randomized doses. The improvement in the composite outcome was seen early in the empagliflozin group and persisted for the 4 years of the study.
This was the first trial of a newly developed diabetes drug to show a statistically significant reduction in cardiovascular risk: a 38% relative risk reduction in cardiovascular death in the treatment group. The risk reduction occurred early in the trial and grew throughout the study, a dramatic finding unequaled even in trials of drugs that specifically target cardiovascular disease. Both doses of empagliflozin provided similar benefit over placebo, reinforcing the validity of the findings. Notably, the empagliflozin group also had a 35% relative risk reduction in heart failure hospitalizations.
Canagliflozin: Evidence of benefit
The CANVAS Program consisted of two sister trials, CANVAS and CANVAS-R, and examined the safety and efficacy of canagliflozin.30 More than 10,000 participants with type 2 diabetes and atherosclerotic disease or at increased risk of cardiovascular disease were randomized to receive canagliflozin or placebo. Canagliflozin led to a 14% relative risk reduction in the composite outcome of cardiovascular death, nonfatal myocardial infarction, or nonfatal stroke, but there was a statistically significant doubling in the incidence of amputations. Unlike empagliflozin, canagliflozin did not demonstrate a significant reduction in death from cardiovascular causes, suggesting that this may not be a class effect of SGLT-2 inhibitors. As with empagliflozin, canagliflozin led to a 33% relative risk reduction in heart failure hospitalizations.
Cardiovascular benefits independent of glucose-lowering
The cardiovascular benefits of empagliflozin in EMPA-REG OUTCOME and of canagliflozin in CANVAS were observed early, suggesting that the mechanism may involve direct effects on the cardiovascular system rather than glycemic modification.
Improved glycemic control with the SGLT-2 inhibitor was seen early in both studies. However, because the trials were designed for glycemic equipoise, with open-label therapy in both groups targeting hemoglobin A1c to standard-of-care goals, the between-group difference in hemoglobin A1c narrowed over the course of each trial after its first assessment. Hemoglobin A1c levels in the SGLT-2 inhibitor groups decreased in the first 12 weeks but then rose over time to nearly the level seen in the placebo groups. The adjusted mean hemoglobin A1c in the placebo groups remained near 8.0% throughout the studies, a target consistent with guidelines from the American Diabetes Association and the European Association for the Study of Diabetes31 for the high-risk populations enrolled.
Blood pressure reduction and weight loss do not explain cardiovascular benefits
SGLT-2 inhibitors lower blood pressure independent of their diuretic effects. In the EMPA-REG OUTCOME trial, the adjusted mean systolic blood pressure was 3 to 4 mm Hg lower in the treatment groups than in the placebo group throughout the trial.29 This degree of blood pressure lowering translates to an estimated 10% to 12% relative risk reduction for major adverse cardiovascular events, including heart failure. That reduction is meaningful, but it does not explain the 38% reduction in cardiovascular deaths seen in the trial. Canagliflozin led to a similar 4-mm Hg reduction in systolic pressure compared with placebo.30
Weight loss was seen with both empagliflozin and canagliflozin but was not dramatic and is unlikely to account for the described cardiovascular benefits.
Theories of cardiovascular benefit
Several mechanisms have been proposed to help explain the observed cardiovascular benefits of SGLT-2 inhibitors.32
Ketone-body elevation. Ferrannini et al33 found that fasting blood concentrations of the ketone body beta-hydroxybutyrate are about twice as high in patients with type 2 diabetes chronically taking empagliflozin as in patients not receiving the drug. Beta-hydroxybutyrate levels peak after a meal and then return to baseline over several hours before rising again during the fasting period. Although this ketone elevation is nowhere near as extreme as in diabetic ketoacidosis (about a 1,000-fold increase), the observed increase may reduce myocardial oxygen demand, as beta-hydroxybutyrate is among the most efficient metabolic substrates for the myocardium.
Red blood cell expansion. Perhaps a more likely explanation of the cardiovascular benefit seen with SGLT-2 inhibitor therapy is the increase in hemoglobin and hematocrit levels. This rise was at first attributed to hemoconcentration secondary to diuresis, but a number of studies have disproven that explanation. In the EMPA-REG OUTCOME trial,29 within 12 weeks of exposure to empagliflozin, hematocrit rose by nearly 4 absolute percentage points compared with the placebo group. This increase is equivalent to transfusing a unit of red blood cells, favorably affecting myocardial oxygen supply.
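To see why a roughly 4-point hematocrit rise matters for oxygen supply, one can estimate the change in arterial oxygen content with the standard physiology formula CaO2 = 1.34 × Hb × SaO2. The conversion from hematocrit points to hemoglobin (about 3 hematocrit points per 1 g/dL of hemoglobin) and the starting hematocrit are rule-of-thumb assumptions for illustration, not trial data:

```python
# Estimated effect of a ~4-point hematocrit rise on arterial oxygen content.
# Rule of thumb (assumption): hemoglobin (g/dL) ~ hematocrit / 3.
def arterial_o2_content(hb_g_dl, sao2=0.98):
    """Oxygen bound to hemoglobin, in mL O2 per dL (dissolved O2 ignored)."""
    return 1.34 * hb_g_dl * sao2

baseline_hct, treated_hct = 41.0, 45.0  # illustrative hematocrit values (%)
baseline_hb = baseline_hct / 3          # ~13.7 g/dL
treated_hb = treated_hct / 3            # 15.0 g/dL

gain = arterial_o2_content(treated_hb) - arterial_o2_content(baseline_hb)
pct = 100 * gain / arterial_o2_content(baseline_hb)
print(round(gain, 2), round(pct, 1))    # ~1.75 mL O2/dL, ~9.8% more oxygen carried
```

Under these assumptions, a 4-point hematocrit rise carries roughly 10% more oxygen per deciliter of blood, consistent with the favorable effect on myocardial oxygen supply described above.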
Reduction in glomerular hypertension. The kidneys regulate glomerular filtration in a process involving the macula densa, an area of specialized cells of the juxtaglomerular apparatus at the end of the thick ascending limb of the loop of Henle that responds to the sodium concentration in the urine. Normally, SGLT-2 transporters in the proximal tubule, upstream of the loop of Henle, reabsorb sodium and glucose into the bloodstream, reducing sodium delivery to the macula densa, which senses this as a low-volume state. The macula densa cells respond by releasing factors that dilate afferent arterioles and increase glomerular filtration. People with diabetes have more glucose to reabsorb and therefore also reabsorb more sodium, leading to glomerular hypertension.
SGLT-2 inhibitors block both glucose and sodium reuptake by SGLT-2, normalizing the response at the macula densa, restoring a normal glomerular filtration rate, and alleviating glomerular hypertension. As the kidney perceives a more normal volume status, renin-angiotensin-aldosterone stimulation is attenuated and sympathetic nervous system activity improves.27,34 If this model of SGLT-2 inhibitor effects on the kidney is correct, these drugs combine effects similar to those of angiotensin-converting enzyme (ACE) inhibitors, angiotensin II receptor blockers (ARBs), mineralocorticoid antagonists, and beta-blockers.
Kidney benefits
Empagliflozin35 and canagliflozin30 both reduced the rate of progression of kidney dysfunction and led to fewer clinically relevant renal events compared with placebo. Treatment and placebo groups also received standard care, so many patients were treated with renin-angiotensin-aldosterone system inhibitors and with good blood pressure control, making the finding that SGLT-2 inhibitors had a significant beneficial effect even more dramatic. Beneficial effects on markers of kidney function were seen early on, suggesting a more favorable hemodynamic effect on the kidney rather than improved glycemic control attenuating microvascular disease.
Empagliflozin approved to reduce clinical events
In December 2016, the FDA approved a new indication for empagliflozin: to reduce the risk of cardiovascular death in patients with type 2 diabetes,36 the first-ever clinical outcome indication for a type 2 diabetes medication. The European Society of Cardiology guidelines now include empagliflozin as preferred therapy for type 2 diabetes, recommending it to prevent the onset of heart failure and prolong life.37 This recommendation goes beyond the evidence of the EMPA-REG OUTCOME trial on which it is based, as the trial studied only patients with known atherosclerotic vascular disease.
The 2016 European Guidelines on cardiovascular disease prevention also recommend that an SGLT-2 inhibitor be considered early for patients with type 2 diabetes and cardiovascular disease to reduce cardiovascular and total mortality.38 The American Diabetes Association in its 2017 guidelines also endorses empagliflozin for treating patients with type 2 diabetes and cardiovascular disease.39 Notably, the American Diabetes Association recommendation, like the product-labeled indication, is not based on glycemic control, a major shift in the association’s guidance.
Cautions with SGLT-2 inhibitors
- Use SGLT-2 inhibitors with caution in patients with low blood pressure, and increase blood pressure monitoring soon after initiation.
- Consider modifying antihypertensive drugs in patients with labile blood pressure.
- Consider stopping or reducing background diuretics when starting an SGLT-2 inhibitor, and reassess volume status after 1 to 2 weeks.
- For patients on insulin, sulfonylureas, or both, consider decreasing dosages when starting an SGLT-2 inhibitor, and reassess glycemic control periodically.
- Counsel patients about urinary hygiene. Although bacterial urinary tract infections have not emerged as a problem, fungal genital infections have, particularly in women and uncircumcised men.
- Treat SGLT-2 inhibitors as “sick-day” medications, ie, drugs to withhold when oral intake falls. Patients with diabetes must adjust their diabetes medications if oral intake is reduced for a day or more, such as during illness or fasting, and SGLT-2 inhibitors should not be taken on these days. Cases of diabetic ketoacidosis have arisen in patients who reduced oral intake while continuing an SGLT-2 inhibitor.
OTHER DRUGS WITH DEVELOPMENT HALTED
Aleglitazar, a peroxisome proliferator-activated receptor agonist taken orally once daily, raised high expectations when it was found in early studies to lower serum triglycerides and raise high-density lipoprotein cholesterol levels in addition to lowering blood glucose. However, a phase 3 trial in more than 7,000 patients was terminated after a median follow-up of 2 years because of increased rates of heart failure, worsened kidney function, bone fractures, and gastrointestinal bleeding.40 Development of this drug was stopped.
Fasiglifam, a G-protein-coupled receptor 40 agonist, was tested in a cardiovascular clinical outcomes trial. Compared with placebo, fasiglifam reduced hemoglobin A1c levels with low risk of hypoglycemia.41 However, safety concerns about increased liver enzyme levels led to the cessation of the drug’s development.42
HOW WILL THIS AFFECT DIABETES MANAGEMENT?
Metformin is still the most commonly prescribed drug for type 2 diabetes, but the evidence for its cardiovascular benefits is only marginal, and it may not remain the first-line therapy for diabetes management in the future. In the EMPA-REG OUTCOME, LEADER, and SUSTAIN-6 trials, the novel diabetes medications were given to patients already treated with available therapies, often including metformin. In the future, empagliflozin, liraglutide, and semaglutide may be indicated as first-line therapies for patients with diabetes and atherosclerotic vascular disease.
SGLT-2 inhibitor therapy can cost about $500 per month, and GLP-1 receptor agonists are only slightly less expensive. The cost may be prohibitive for many patients. As more evidence, guidelines, and FDA criteria support the use of these novel diabetes drugs, third-party payers and pharmaceutical companies may be motivated to lower costs to help reach more patients who can benefit from these therapies.
- US Food and Drug Administration. Guidance for industry. Diabetes mellitus—evaluating cardiovascular risk in new antidiabetic therapies to treat type 2 diabetes. www.fda.gov/downloads/Drugs/.../Guidances/ucm071627.pdf. Accessed September 1, 2017.
- Ye Y, Keyes KT, Zhang C, Perez-Polo JR, Lin Y, Birnbaum Y. The myocardial infarct size-limiting effect of sitagliptin is PKA-dependent, whereas the protective effect of pioglitazone is partially dependent on PKA. Am J Physiol Heart Circ Physiol 2010; 298:H1454–H1465.
- Hocher B, Sharkovska Y, Mark M, Klein T, Pfab T. The novel DPP-4 inhibitors linagliptin and BI 14361 reduce infarct size after myocardial ischemia/reperfusion in rats. Int J Cardiol 2013; 167:87–93.
- Woo JS, Kim W, Ha SJ, et al. Cardioprotective effects of exenatide in patients with ST-segment-elevation myocardial infarction undergoing primary percutaneous coronary intervention: results of exenatide myocardial protection in revascularization study. Arterioscler Thromb Vasc Biol 2013; 33:2252–2260.
- Lønborg J, Vejlstrup N, Kelbæk H, et al. Exenatide reduces reperfusion injury in patients with ST-segment elevation myocardial infarction. Eur Heart J 2012; 33:1491–1499.
- van Poppel PC, Netea MG, Smits P, Tack CJ. Vildagliptin improves endothelium-dependent vasodilatation in type 2 diabetes. Diabetes Care 2011; 34:2072–2077.
- Kröller-Schön S, Knorr M, Hausding M, et al. Glucose-independent improvement of vascular dysfunction in experimental sepsis by dipeptidyl-peptidase 4 inhibition. Cardiovasc Res 2012; 96:140–149.
- Ta NN, Schuyler CA, Li Y, Lopes-Virella MF, Huang Y. DPP-4 (CD26) inhibitor alogliptin inhibits atherosclerosis in diabetic apolipoprotein E-deficient mice. J Cardiovasc Pharmacol 2011; 58:157–166.
- Sauvé M, Ban K, Momen MA, et al. Genetic deletion or pharmacological inhibition of dipeptidyl peptidase-4 improves cardiovascular outcomes after myocardial infarction in mice. Diabetes 2010; 59:1063–1073.
- Read PA, Khan FZ, Heck PM, Hoole SP, Dutka DP. DPP-4 inhibition by sitagliptin improves the myocardial response to dobutamine stress and mitigates stunning in a pilot study of patients with coronary artery disease. Circ Cardiovasc Imaging 2010; 3:195–201.
- Matikainen N, Mänttäri S, Schweizer A, et al. Vildagliptin therapy reduces postprandial intestinal triglyceride-rich lipoprotein particles in patients with type 2 diabetes. Diabetologia 2006; 49:2049–2057.
- Frederich R, Alexander JH, Fiedorek FT, et al. A systematic assessment of cardiovascular outcomes in the saxagliptin drug development program for type 2 diabetes. Postgrad Med 2010; 122:16–27.
- Scirica BM, Bhatt DL, Braunwald E, et al; SAVOR-TIMI 53 Steering Committee and Investigators. Saxagliptin and cardiovascular outcomes in patients with type 2 diabetes mellitus. N Engl J Med 2013; 369:1317–1326.
- Scirica BM, Braunwald E, Raz I, et al; SAVOR-TIMI 53 Steering Committee and Investigators. Heart failure, saxagliptin, and diabetes mellitus: observations from the SAVOR-TIMI 53 randomized trial. Circulation 2014; 130:1579–1588.
- White WB, Cannon CP, Heller SR, et al; EXAMINE Investigators. Alogliptin after acute coronary syndrome in patients with type 2 diabetes. N Engl J Med 2013; 369:1327–1335.
- Zannad F, Cannon CP, Cushman WC, et al; EXAMINE Investigators. Heart failure and mortality outcomes in patients with type 2 diabetes taking alogliptin versus placebo in EXAMINE: a multicentre, randomised, double-blind trial. Lancet 2015; 385:2067–2076.
- US Food and Drug Administration. Diabetes medications containing saxagliptin and alogliptin: drug safety communication—risk of heart failure. https://www.fda.gov/safety/medwatch/safetyinformation/safetyalertsforhumanmedicalproducts/ucm494252.htm. Accessed August 23, 2017.
- Green JB, Bethel MA, Armstrong PW, et al; TECOS Study Group. Effect of sitagliptin on cardiovascular outcomes in type 2 diabetes. N Engl J Med 2015; 373:232–242.
- McGuire DK, Van de Werf F, Armstrong PW, et al; Trial Evaluating Cardiovascular Outcomes With Sitagliptin (TECOS) Study Group. Association between sitagliptin use and heart failure hospitalization and related outcomes in type 2 diabetes mellitus: secondary analysis of a randomized clinical trial. JAMA Cardiol 2016; 1:126–135.
- Pfeffer MA, Claggett B, Diaz R, et al; ELIXA Investigators. Lixisenatide in patients with type 2 diabetes and acute coronary syndrome. N Engl J Med 2015; 373:2247–2257.
- Marso SP, Daniels GH, Brown-Frandsen K, et al; LEADER Steering Committee; LEADER Trial Investigators. Liraglutide and cardiovascular outcomes in type 2 diabetes. N Engl J Med 2016; 375:311–322.
- Marso SP, Bain SC, Consoli A, et al; SUSTAIN-6 Investigators. Semaglutide and cardiovascular outcomes in patients with type 2 diabetes. N Engl J Med 2016; 375:1834–1844.
- Rosenstock J, Balas B, Charbonnel B, et al; T-EMERGE 2 Study Group. The fate of taspoglutide, a weekly GLP-1 receptor agonist, versus twice-daily exenatide for type 2 diabetes: the T-EMERGE 2 trial. Diabetes Care 2013; 36:498–504.
- Wright EM. Renal Na(+)-glucose cotransporters. Am J Physiol 2001; 280:F10–F18.
- Lee YJ, Lee YJ, Han HJ. Regulatory mechanisms of Na(+)/glucose cotransporters in renal proximal tubule cells. Kidney Int 2007; 72(suppl 106):S27–S35.
- Hummel CS, Lu C, Loo DD, Hirayama BA, Voss AA, Wright EM. Glucose transport by human renal Na+/D-glucose cotransporters SGLT1 and SGLT2. Am J Physiol Cell Physiol 2011; 300:C14–C21.
- Heerspink HJ, Perkins BA, Fitchett DH, Husain M, Cherney DZ. Sodium glucose cotransporter 2 inhibitors in the treatment of diabetes mellitus: cardiovascular and kidney effects, potential mechanisms, and clinical applications. Circulation 2016; 134:752–772.
- Lapuerta P, Zambrowicz B, Strumph P, Sands A. Development of sotagliflozin, a dual sodium-dependent glucose transporter 1/2 inhibitor. Diabetes Vasc Dis Res 2015; 12:101–110.
- Zinman B, Wanner C, Lachin JM, et al, for the EMPA-REG OUTCOME Investigators. Empagliflozin, cardiovascular outcomes, and mortality in type 2 diabetes. N Engl J Med 2015; 373:2117–2128.
- Neal B, Perkovic V, Mahaffey KW, et al, for the CANVAS Program Collaborative Group. Canagliflozin and cardiovascular and renal events in type 2 diabetes. N Engl J Med 2017; 377:644–657.
- Inzucchi SE, Bergenstal RM, Buse JB, et al. Management of hyperglycemia in type 2 diabetes, 2015: a patient-centered approach: update to a position statement of the American Diabetes Association and the European Association for the Study of Diabetes. Diabetes Care 2015; 38:140–149.
- Verma S, McMurray JJV, Cherney DZI. The metabolodiuretic promise of sodium-dependent glucose cotransporter 2 inhibition: the search for the sweet spot in heart failure. JAMA Cardiol 2017; 2:939–940.
- Ferrannini E, Mark M, Mayoux E. CV protection in the EMPA-REG OUTCOME trial: a “thrifty substrate” hypothesis. Diabetes Care 2016; 39:1108–1114.
- Cherney DZ, Perkins BA, Soleymanlou N, et al. Renal hemodynamic effect of sodium-glucose cotransporter 2 inhibition in patients with type 1 diabetes mellitus. Circulation 2014; 129:587–597.
- Wanner C, Inzucchi SE, Lachin JM, et al, for the EMPA-REG OUTCOME Investigators. Empagliflozin and progression of kidney disease in type 2 diabetes. N Engl J Med 2016; 375:323–334.
- US Food and Drug Administration. FDA News Release. FDA approves Jardiance to reduce cardiovascular death in adults with type 2 diabetes. https://www.fda.gov/newsevents/newsroom/pressannouncements/ucm531517.htm. Accessed August 23, 2017.
- Ponikowski P, Voors AA, Anker SD, et al; Authors/Task Force Members; Document Reviewers. 2016 ESC Guidelines for the diagnosis and treatment of acute and chronic heart failure: the Task Force for the diagnosis and treatment of acute and chronic heart failure of the European Society of Cardiology (ESC). Developed with the special contribution of the Heart Failure Association (HFA) of the ESC. Eur J Heart Fail 2016; 18:891–975.
- Piepoli MF, Hoes AW, Agewall S, et al; Authors/Task Force Members. 2016 European guidelines on cardiovascular disease prevention in clinical practice. The Sixth Joint Task Force of the European Society of Cardiology and Other Societies on Cardiovascular Disease Prevention in Clinical Practice (constituted by representatives of 10 societies and by invited experts). Developed with the special contribution of the European Association for Cardiovascular Prevention & Rehabilitation. Eur Heart J 2016; 37:2315–2381.
- American Diabetes Association. American Diabetes Association standards of medical care in diabetes. Diabetes Care 2017; 40(suppl 1):S1–S135.
- Lincoff AM, Tardif JC, Schwartz GG, et al; AleCardio Investigators. Effect of aleglitazar on cardiovascular outcomes after acute coronary syndrome in patients with type 2 diabetes mellitus: the AleCardio randomized clinical trial. JAMA 2014; 311:1515–1525.
- Kaku K, Enya K, Nakaya R, Ohira T, Matsuno R. Efficacy and safety of fasiglifam (TAK-875), a G protein-coupled receptor 40 agonist, in Japanese patients with type 2 diabetes inadequately controlled by diet and exercise: a randomized, double-blind, placebo-controlled, phase III trial. Diabetes Obes Metab 2015; 17:675–681.
- Takeda Press Release. Takeda announces termination of fasiglifam (TAK-875) development. www.takeda.us/newsroom/press_release_detail.aspx?year=2013&id=296. Accessed September 9, 2017.
KEY POINTS
- Saxagliptin, alogliptin, and sitagliptin conferred neither benefit nor harm for the composite outcome of cardiovascular death, myocardial infarction, or stroke. Saxagliptin and alogliptin carry warnings of increased risk of heart failure; sitagliptin did not affect heart failure risk.
- Liraglutide and semaglutide showed evidence of cardiovascular benefit; lixisenatide was noninferior to placebo.
- Empagliflozin is now approved to reduce risk of cardiovascular death in patients with type 2 diabetes and atherosclerotic cardiovascular disease.
- Canagliflozin decreased the composite outcome of cardiovascular death, nonfatal myocardial infarction, or nonfatal stroke in patients with type 2 diabetes with or at risk of cardiovascular disease, but also increased the risk of amputation and did not significantly reduce the individual outcome of cardiovascular death.
Antibiotic stewardship: Why we must, how we can
Antibiotic stewardship has always been a good idea. Now it is also required by the Joint Commission and the Centers for Medicare and Medicaid Services (CMS). This article reviews the state of antibiotic use in the United States and efforts to improve antibiotic stewardship in practice.
ANTIBIOTICS ARE DIFFERENT FROM OTHER DRUGS
Their efficacy wanes over time. Antibiotics are the only medications that become less useful even when used correctly. Other types of drugs are continuously being improved, but the old ones work as well today as when they were introduced; antibiotics that were in use 50 years ago, in contrast, are no longer as effective.
They are a shared resource. Antibiotics are regularly used by many specialties to deliver routine and advanced medical care. Surgeries, transplantation, and immunosuppressive therapy would be unsafe without antibiotics to treat infections. Some patients awaiting lung transplant are not considered good candidates if they have evidence of colonization by antibiotic-resistant organisms.
Individual use may harm others. Even people who are not exposed to an antibiotic can suffer the consequences of how others use them.
In a retrospective cohort study, Freedberg et al1 analyzed the risk of hospitalized patients developing Clostridium difficile infection and found that the risk was higher if the previous occupant of the bed had received antibiotics. The putative mechanism is that a patient receiving antibiotics develops altered gut flora and sheds C difficile spores into the environment, where normal cleaning does not eradicate them; the next patient using the bed is then exposed and infected.
ANTIBIOTIC USE IS HIGH
The US Centers for Disease Control and Prevention (CDC) monitors antibiotic prescribing throughout the United States. In the outpatient setting, enough antibiotics are prescribed nationwide for 5 of every 6 people to receive 1 course annually (835 prescriptions per 1,000 people). Rates vary widely among states, from a low in Alaska (501 prescriptions per 1,000 people) to a high in West Virginia (1,285 prescriptions per 1,000 people).2 In comparison, Scandinavian countries prescribe about 400 courses per 1,000 people, about 20% less than our lowest-prescribing state.3
Antibiotics are probably the most frequently prescribed drugs in US hospitals. Data from 2006 to 2012 showed that 55% of hospitalized patients received at least 1 dose of an antibiotic and that overall about 75% of all hospital days involved an antibiotic.4 Rates did not vary by hospital size, but nonteaching hospitals tended to use antibiotics more than teaching hospitals. Antibiotic use is much more common in intensive care units than in hospital wards (1,092 and 720 days of antibiotic treatment per 1,000 patient-days, respectively).
Although overall antibiotic use did not change significantly over the years of the survey, use patterns did: fluoroquinolone use dropped by 20%, possibly reflecting rising resistance or increased attention to associated side effects (although fluoroquinolones remain the most widely prescribed inpatient antibiotic class), and use of first-generation cephalosporins fell by 7%. A cause for concern is that the use of broad-spectrum and “last-resort” antibiotics increased: carbapenem use by 37%, vancomycin use by 32%, beta-lactam/beta-lactamase inhibitor use by 26%, and third- and fourth-generation cephalosporin use by 12%.4
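The inpatient rates quoted above are expressed as days of antibiotic therapy per 1,000 patient-days, the usual normalization for comparing units of different sizes. A minimal sketch of that metric (the function name and sample counts below are illustrative, not taken from the survey):

```python
def dot_per_1000_patient_days(antibiotic_days: float, patient_days: float) -> float:
    """Days of therapy (DOT) per 1,000 patient-days: the number of days
    on which any antibiotic was given, normalized to unit census."""
    return 1000 * antibiotic_days / patient_days

# Hypothetical ICU: 450 antibiotic days accrued over 412 patient-days
print(round(dot_per_1000_patient_days(450, 412)))  # 1092
```

Because the rate is per 1,000 patient-days rather than per admission, a small critical-care unit and a large ward can be compared directly.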
About one-third of use is unnecessary
Many studies have tried to measure the extent of inappropriate or unnecessary antibiotic use. The results have been remarkably consistent at 20% to 40% for both inpatient and outpatient studies. One study of hospitalized patients not in the intensive care unit found that 30% of 1,941 days of prescribed antimicrobial therapy were unnecessary, mostly because patients received antibiotics for longer than needed or because antibiotics were used to treat noninfectious syndromes or colonizing microorganisms.5
ANTIBIOTIC EXPOSURE HAS NEGATIVE CONSEQUENCES
Any exposure to a medication involves the potential for side effects; this is true for antibiotics whether or not their use is appropriate. An estimated 140,000 emergency department visits occur annually for adverse reactions to antibiotics.6 In hospitalized patients, these reactions can be severe, including renal and bone marrow toxicity. As with any medication, the risks and benefits of antibiotic therapy must be weighed patient by patient.
Disturbance of gut microbiome
Antibiotics’ disruptive effects on normal gut flora are becoming better understood and are even believed to increase the risk of obesity and asthma.7,8
Animal models provide evidence that altered flora is associated with sepsis, which is attributed to the gut microbiome’s role in containing dissemination of bacteria in the body.9 An ecological study provides further evidence. Baggs et al10 retrospectively studied more than 9 million patients discharged without sepsis from 473 US hospitals, of whom 0.6% were readmitted for sepsis within 90 days. Exposure to a broad-spectrum antibiotic during the initial stay was associated with a 50% increase in the risk of readmission for sepsis within 90 days of discharge (odds ratio 1.50, 95% confidence interval 1.47–1.53).
Increase of C difficile infections
Antibiotics exert selective pressure, killing susceptible bacteria and allowing resistant bacteria to thrive.
The risk of C difficile infection is 7 to 10 times higher than at baseline for 1 month after antibiotic use and 3 times higher than baseline in the 2 months after that.11 Multiple studies have found that stewardship efforts to reduce antibiotic use have resulted in fewer C difficile infections.
A nationwide effort in England over the past decade to reduce C difficile infections has resulted in 50% less use of fluoroquinolones and third-generation cephalosporins in patients over age 65. During that time, the incidence of C difficile infection in that age group fell by about 70%, with concomitant reductions in mortality and colectomy associated with infection. No increase in rates of hospital admissions, infection complications, or death was observed.12–14
GOAL: BETTER CARE (NOT CHEAPER CARE OR LESS ANTIBIOTIC USE)
The primary goal of antibiotic stewardship is better patient care. The goal is not reduced antibiotic use or cost savings, although these could be viewed as favorable side effects. Sometimes, better patient care involves using more antibiotics: eg, a patient with presumed sepsis should be started quickly on broad-spectrum antibiotics, an action that also falls under antibiotic stewardship. The focus for stewardship efforts should be on optimizing appropriate use, ie, promoting the use of the right agent at the correct dosage and for the proper duration.
Stewardship improves clinical outcomes
Antibiotic stewardship is important not only to society but to individual patients.
Singh et al15 randomized patients suspected of having ventilator-associated pneumonia (but with a low likelihood of pneumonia) to either a 3-day course of ciprofloxacin or standard care (antibiotics for 10 to 21 days, with the drug and duration chosen by the treating physician). After 3 days, the patients in the experimental group were reevaluated, and antibiotics were stopped if the likelihood of pneumonia was still deemed low. In patients who received only the short course of antibiotics, mean length of stay in the intensive care unit was 9 days and the risk of acquiring an antibiotic-resistant superinfection during hospitalization was 14%, compared with a 15-day length of stay and 38% risk of antibiotic-resistant superinfection in patients in the standard treatment group.
Fishman16 reported a study at a single hospital that randomized patients to either receive standard care according to physician choice or be treated according to an antibiotic stewardship program. Patients in the antibiotic stewardship group were almost 3 times more likely than controls to receive appropriate therapy according to guidelines. More important, the antibiotic stewardship patients were almost twice as likely to be cured of their infection and were more than 80% less likely to have treatment failure.
DEVELOPING EFFECTIVE ANTIBIOTIC STEWARDSHIP PROGRAMS
A good model for improving antibiotic use is a recent nationwide program designed to reduce central line-associated bloodstream infections.17 Rates of these infections have dropped by about 50% over the past 5 years. The program included:
- Research to better understand the problem and how to fight it
- Well-defined programs and interventions
- Education to implement interventions, eg, deploying teams to teach better techniques of inserting and maintaining central lines
- A strong national measurement system (the CDC’s National Healthcare Safety Network) to track infections.
What constitutes an antibiotic stewardship program?
The CDC examined successful stewardship programs in a variety of hospital types, including large academic hospitals and smaller hospitals, and identified 7 core elements common to successful antibiotic stewardship programs18:
- Leadership commitment from administration
- A single leader responsible for outcomes
- A single pharmacy leader
- Tracking of antibiotic use
- Regular reporting of antibiotic use and resistance
- Educating providers on use and resistance
- Specific improvement interventions.
Stewardship is harder in some settings
In a 2014 CDC survey, 41% of more than 4,000 responding hospitals reported having antibiotic stewardship programs with all 7 core elements. The single element that predicted whether a complete program was in place was leadership support.19 The following year, 48% of respondents reported having a complete program in place. Percentages varied among states, with highs in Utah (77%) and California (70%) and lows in North Dakota (12%) and Vermont (7%). Large hospitals and major teaching hospitals were more likely to have a program with all 7 elements: 31% of hospitals with 50 or fewer beds had a complete program vs 66% of hospitals with at least 200 beds.20
Short-stay, critical-access hospitals pose a special challenge, as only 26% reported having all core elements.19,20 These facilities have fewer than 25 beds, and many patient stays are less than 3 days. Some do not employ full-time pharmacists or full-time clinicians. The CDC is collaborating with the American Hospital Association and the Pew Charitable Trusts to focus efforts on helping these hospitals, which requires a more flexible approach. About 100 critical-access hospitals nationwide have reported implementing all of the core elements and can serve as models for the others.
MEASURING IMPROVEMENT
The CDC has adopted a 3-pronged approach to measuring improvements in hospital antibiotic use:
- Estimate national aggregate antibiotic use, as described above
- Acquire information on antibiotic use at facility, practice, and provider levels
- Assess appropriate antibiotic use.
In hospitals, the CDC has concentrated on facility-level measurement. Hospitals need a system to track their own use and compare it with that of similar facilities. The CDC’s monitoring program, the Antibiotic Use Option of the National Healthcare Safety Network, captures electronic data on antibiotic use in a facility, enabling monitoring of use in each unit. Data can also be aggregated at regional, state, and national levels. This information can be used to develop benchmarks for antibiotic use, so that similar hospitals can be compared.
What is the ‘right’ amount of antibiotic use? Enter SAAR
Creating benchmarks for antibiotic use poses a number of challenges compared with most other areas in healthcare. Most public health measures are binary—eg, people either get an infection, a vaccination, or a smoking cessation intervention or not—and the direction of progress is clear. Antibiotics are different: not everybody needs them, but some people do. Usage should be reduced, but by exactly how much is unclear and varies between hospitals. In addition, being an outlier does not necessarily indicate a problem: a hospital unit for organ transplants will have high rates of antibiotic use, which is likely appropriate.
The CDC has taken initial steps to develop a risk-adjusted benchmark measure for hospital antibiotic use, the Standardized Antimicrobial Administration Ratio (SAAR). It compares a hospital’s observed antibiotic use with a calculation of predicted use based on its facility characteristics. Although still at an early stage, SAAR has been released and has been endorsed by the National Quality Forum. About 200 hospitals are submitting data to the CDC and collaborating with the CDC to evaluate the SAAR’s utility in driving improved antibiotic use.
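At its core, the SAAR is a ratio of observed to predicted antibiotic use. The CDC's risk-adjustment model that generates the predicted value is not detailed here, so the following minimal sketch treats predicted use as a given input; the function name and example figures are hypothetical:

```python
def saar(observed_days_of_therapy: float, predicted_days_of_therapy: float) -> float:
    """Standardized Antimicrobial Administration Ratio (SAAR):
    observed antibiotic use divided by use predicted from facility
    characteristics. A value of 1.0 means use matched prediction;
    above 1.0, more than predicted; below 1.0, less than predicted."""
    if predicted_days_of_therapy <= 0:
        raise ValueError("predicted use must be positive")
    return observed_days_of_therapy / predicted_days_of_therapy

# Hypothetical example: a facility with 850 observed days of therapy
# against 1,000 predicted days has a SAAR of 0.85.
print(round(saar(850, 1000), 2))
```

As with any risk-adjusted ratio, a SAAR far from 1.0 flags a facility for review rather than proving inappropriate use.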
Problems in measuring appropriate use
Measuring appropriate antibiotic use is easier in the outpatient setting, where detailed data have been collected for many years.
Fleming-Dutra et al21 compared medications prescribed during outpatient visits and the diagnoses coded for the visits. They found that about 13% of all outpatient visits resulted in an antibiotic prescription, 30% of which had no listed diagnosis that would justify an antibiotic (eg, viral upper respiratory infection). This kind of information provides a target for stewardship programs.
It is more difficult to conduct such a study in a hospital setting. Simply comparing discharge diagnoses to antibiotics prescribed is not useful: often antibiotics are started presumptively on admission for a patient with signs and symptoms of an infection, then stopped if the diagnosis does not warrant antibiotics, which is a reasonable strategy.
Also, many times, a patient with asymptomatic bacteriuria, which does not warrant antibiotics, is misdiagnosed as having a urinary tract infection, which does. So simply looking at the discharge code may not reveal whether therapy was appropriate.
Some studies have provided useful information. Fridkin et al22 studied 36 hospitals for the use of vancomycin, which is an especially good candidate drug for study because guidelines exist for appropriate use. Data were collected only from patients given vancomycin for more than 3 days, which should have eliminated empiric use of the drug and included only pathogen-driven therapy. Cases where therapy was for skin and soft-tissue infections were excluded because cultures are not usually obtained for these cases. Of patients given vancomycin, 9% had no diagnostic culture obtained at antibiotic initiation, 22% had diagnostic culture but results showed no gram-positive bacterial growth, and 5% had culture results revealing only oxacillin-susceptible Staphylococcus aureus. In 36% of cases, opportunities existed for improved prescribing.
Such data could be collected from the electronic medical record, and the CDC is focusing efforts in this direction.
NATIONAL ACTIVITIES IN ANTIBIOTIC STEWARDSHIP
In 2014, the White House launched a national strategy to combat antibiotic resistance,23 followed by an action plan in 2015.24 As a result, new investments have been made to improve antibiotic use, including funding for state health departments to begin stewardship efforts and to expand public awareness of the problems of antibiotic overuse. Research efforts are also being funded to improve implementation of existing stewardship practices and to develop new ones.
CMS is also exploring how to drive improved antibiotic use. In October 2016, it started requiring all US nursing homes to have antibiotic stewardship programs, and a similar requirement for hospitals has been proposed.
The Joint Commission issued a standard requiring that all their accredited facilities, starting with hospitals, have an antibiotic stewardship program by January 2017. This standard requires implementation of all the CDC’s core elements.
PROVEN INTERVENTIONS
Focusing on key interventions that are likely to be effective and well received by providers is a useful strategy for antibiotic stewardship efforts. A number of such interventions have been supported by research.
Postprescription antibiotic reviews or antibiotic ‘time-outs’
Antibiotics are often started empirically to treat hospitalized patients suspected of having an infection. The need for the antibiotic should be assessed a few days later, when culture results and more clinical information are available.
Elligsen et al25 evaluated the effects of providing a formal review and suggestions for antimicrobial optimization to critical care teams of 3 intensive care units in a single hospital after 3 and 10 days of antibiotic therapy. Mean monthly antibiotic use decreased from 644 days of therapy per 1,000 patient-days in the preintervention period to 503 days of therapy per 1,000 patient-days (P < .0001). C difficile infections were reduced from 11 cases to 6. Overall gram-negative susceptibility to meropenem increased in the critical care units.
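The "days of therapy per 1,000 patient-days" figures reported by Elligsen et al are a standard normalization that allows units of different sizes to be compared. A minimal sketch of the calculation (the function name and the 500 patient-day denominator are illustrative, not from the study):

```python
def dot_per_1000_patient_days(days_of_therapy: int, patient_days: int) -> float:
    """Days of therapy (DOT) per 1,000 patient-days: each antibiotic a
    patient receives on a given calendar day counts as 1 DOT, and the
    total is normalized by patient-days so that units and hospitals of
    different sizes can be compared."""
    if patient_days <= 0:
        raise ValueError("patient_days must be positive")
    return 1000 * days_of_therapy / patient_days

# Hypothetical unit: 322 DOT accrued over 500 patient-days gives
# 644 DOT per 1,000 patient-days, matching the preintervention
# rate reported in the study.
print(round(dot_per_1000_patient_days(322, 500)))
```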
Targeting specific infections
Some infections are especially important to target with improvement efforts.
In 2011, Magill et al26 conducted 1-day prevalence surveys in 183 hospitals in 10 states to examine patterns of antibiotic use. They found that lower respiratory tract infections and urinary tract infections accounted for more than half of all antibiotic use (35% and 22%, respectively), making them good candidates for improved use.
Community-acquired pneumonia can be targeted on multiple fronts. One study showed that almost 30% of patients diagnosed with community-acquired pneumonia in the emergency department did not actually have pneumonia.27 Duration of antibiotic therapy can also be targeted: guidelines recommend that most patients with uncomplicated community-acquired pneumonia receive 5 to 7 days of antibiotic therapy. Avdic et al28 performed a simple intervention involving education and feedback to teams in 1 hospital regarding antibiotic choice and duration, which reduced the median duration of therapy for community-acquired pneumonia from 10 days to 7 days.
Asymptomatic bacteriuria is often misdiagnosed as a urinary tract infection and treated unnecessarily.29–31
Trautner et al32 addressed this problem by targeting urine cultures rather than antibiotics, using a simple algorithm: if a patient did not have symptoms of urinary tract infection (fever, acute hematuria, delirium, rigors, flank pain, pelvic discomfort, urgency, frequency, dysuria, suprapubic pain), a urine culture was not recommended. If a patient did have symptoms but a problem other than urinary tract infection was deemed likely, evaluation of other sources of infection was recommended. Use of the algorithm resulted in fewer urine cultures and less antibiotic overtreatment of asymptomatic bacteriuria. Reductions persisted after the intervention ended.
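The decision rule described above is simple enough to express directly. This sketch paraphrases the algorithm from the text; the function name and string labels are illustrative, not from the published protocol:

```python
# Symptoms that justify obtaining a urine culture, per the algorithm
# described by Trautner et al (list taken from the text above).
UTI_SYMPTOMS = {
    "fever", "acute hematuria", "delirium", "rigors", "flank pain",
    "pelvic discomfort", "urgency", "frequency", "dysuria",
    "suprapubic pain",
}

def urine_culture_recommendation(symptoms: set, other_source_likely: bool = False) -> str:
    """Sketch of the decision rule: no UTI symptoms -> do not culture;
    symptoms present but another source of infection deemed likely ->
    evaluate that source first; otherwise a culture is reasonable."""
    if not symptoms & UTI_SYMPTOMS:
        return "do not culture"
    if other_source_likely:
        return "evaluate other sources of infection"
    return "obtain urine culture"

print(urine_culture_recommendation(set()))            # do not culture
print(urine_culture_recommendation({"dysuria"}))      # obtain urine culture
```

By gating the culture rather than the prescription, the rule removes the trigger for treating asymptomatic bacteriuria in the first place.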
Antibiotic time-out at hospital discharge
Another study evaluated an intervention that required a pharmacist consultation for the critical care team when a patient was to be discharged with intravenous antibiotics (most often for pneumonia). In 28% of cases, chart review revealed that the infection had been completely treated at the time of discharge, so further antibiotic treatment was not indicated. No patients who avoided antibiotics at discharge were readmitted or subsequently visited the emergency department.33
Targeting outpatient settings
A number of studies have evaluated simple interventions to improve outpatient antibiotic prescribing. Meeker et al34 had providers place a poster in their examination rooms with a picture of the physician and a signed letter committing to the appropriate use of antibiotics. Inappropriate antibiotic use decreased 20% in the intervention group vs controls (P = .02).
In a subsequent study,35 the same group required providers to include a justification note in the electronic medical record every time an antibiotic was prescribed for an indication when guidelines do not recommend one. Inappropriate prescribing dropped from 23% to 5% (P < .001).
Another intervention in this study35 provided physicians with periodic feedback according to whether their therapy was concordant with guidelines. They received an email with a subject line of either “You are a top performer” or “You are not a top performer.” The contents of the email provided data on how many antibiotic prescriptions they wrote for conditions that did not warrant them and how their prescribing habits compared with those of their top-performing peers. Mean inappropriate antibiotic prescribing fell from 20% to 4%.35
This is a critical time for antibiotic stewardship efforts in the United States. The need has never been more urgent and, fortunately, the opportunities have never been more abundant. Requirements for stewardship programs will drive implementation, but hospitals will need support and guidance to help ensure that stewardship programs are as effective as possible. Ultimately, improving antibiotic use will require collaboration among all stakeholders. The CDC is eager to partner with providers and others in their efforts to improve antibiotic use.
- Freedberg DE, Salmasian H, Cohen B, Abrams JA, Larson EL. Receipt of antibiotics in hospitalized patients and risk for Clostridium difficile infection in subsequent patients who occupy the same bed. JAMA Intern Med 2016; 176:1801–1808.
- Centers for Disease Control and Prevention. Get smart: know when antibiotics work. Measuring outpatient antibiotic prescribing. https://www.cdc.gov/getsmart/community/programs-measurement/measuring-antibiotic-prescribing.html. Accessed February 5, 2017.
- Ternhag A, Hellman J. More on U.S. outpatient antibiotic prescribing, 2010. N Engl J Med 2013; 369:1175–1176.
- Baggs J, Fridkin SK, Pollack LA, Srinivasan A, Jernigan JA. Estimating national trends in inpatient antibiotic use among US hospitals from 2006 to 2012. JAMA Intern Med 2016; 176:1639–1648.
- Hecker MT, Aron DC, Patel NP, Lehmann MK, Donskey CJ. Unnecessary use of antimicrobials in hospitalized patients: current patterns of misuse with an emphasis on the antianaerobic spectrum of activity. Arch Intern Med 2003; 163:972–978.
- Shehab N, Patel PR, Srinivasan A, Budnitz DS. Emergency department visits for antibiotic-associated adverse events. Clin Infect Dis 2008; 47:735–743.
- Korpela K, de Vos WM. Antibiotic use in childhood alters the gut microbiota and predisposes to overweight. Microb Cell 2016; 3:296–298.
- Gray LE, O’Hely M, Ranganathan S, Sly PD, Vuillermin P. The maternal diet, gut bacteria, and bacterial metabolites during pregnancy influence offspring asthma. Front Immunol 2017; 8:365.
- Haak BW, Wiersinga WJ. The role of the gut microbiota in sepsis. Lancet Gastroenterol Hepatol 2017; 2:135–143.
- Baggs J, Jernigan J, McCormick K, Epstein L, Laufer-Halpin AS, McDonald C. Increased risk of sepsis during hospital readmission following exposure to certain antibiotics during hospitalization. Abstract presented at IDWeek, October 26–30, 2016, New Orleans, LA. https://idsa.confex.com/idsa/2016/webprogram/Paper58587.html. Accessed August 8, 2017.
- Hensgens MP, Goorhuis A, Dekkers OM, Kuijper EJ. Time interval of increased risk for Clostridium difficile infection after exposure to antibiotics. J Antimicrob Chemother 2012; 67:742–748.
- Ashiru-Oredope D, Sharland M, Charani E, McNulty C, Cooke J; ARHAI Antimicrobial Stewardship Group. Improving the quality of antibiotic prescribing in the NHS by developing a new antimicrobial stewardship programme: Start Smart—Then Focus. J Antimicrob Chemother 2012; 67(suppl 1):i51–i63.
- Wilcox MH, Shetty N, Fawley WN, et al. Changing epidemiology of Clostridium difficile infection following the introduction of a national ribotyping-based surveillance scheme in England. Clin Infect Dis 2012; 55:1056–1063.
- Public Health England. Clostridium difficile infection: monthly data by NHS acute trust. https://www.gov.uk/government/statistics/clostridium-difficile-infection-monthly-data-by-nhs-acute-trust. Accessed August 4, 2017.
- Singh N, Rogers P, Atwood CW, Wagener MM, Yu VL. Short-course empiric antibiotic therapy for patients with pulmonary infiltrates in the intensive care unit. A proposed solution for indiscriminate antibiotic prescription. Am J Respir Crit Care Med 2000; 162:505–511.
- Fishman N. Antimicrobial stewardship. Am J Med 2006; 119:S53–S61.
- US Centers for Disease Control and Prevention. Healthcare-associated infections (HAI) progress report. https://www.cdc.gov/hai/surveillance/progress-report/index.html. Accessed August 4, 2017.
- US Centers for Disease Control and Prevention. Get Smart for Healthcare. Core elements of hospital antibiotic stewardship programs. https://www.cdc.gov/getsmart/healthcare/implementation/core-elements.html. Accessed August 8, 2017.
- Pollack LA, van Santen KL, Weiner LM, Dudeck MA, Edwards JR, Srinivasan A. Antibiotic stewardship programs in U.S. acute care hospitals: findings from the 2014 National Healthcare Safety Network Annual Hospital Survey. Clin Infect Dis 2016; 63:443–449.
- US Centers for Disease Control and Prevention. Antibiotic stewardship in acute care hospitals by state 2014. https://gis.cdc.gov/grasp/PSA/STMapView.html. Accessed August 4, 2017.
- Fleming-Dutra KE, Hersh AL, Shapiro DJ, et al. Prevalence of inappropriate antibiotic prescriptions among US ambulatory care visits, 2010–2011. JAMA 2016; 315:1864–1873.
- Fridkin S, Baggs J, Fagan R, et al; Centers for Disease Control and Prevention (CDC). Vital signs: improving antibiotic use among hospitalized patients. MMWR Morb Mortal Wkly Rep 2014; 63:194–200.
- National strategy for combating antibiotic-resistant bacteria. September 2014. https://www.whitehouse.gov/sites/default/files/docs/carb_national_strategy.pdf. Accessed August 9, 2017.
- National action plan for combating antibiotic-resistant bacteria. March 2015. https://www.whitehouse.gov/sites/default/files/docs/national_action_plan_for_combating_antibotic-resistant_bacteria.pdf
- Elligsen M, Walker SA, Pinto R, et al. Audit and feedback to reduce broad-spectrum antibiotic use among intensive care unit patients: a controlled interrupted time series analysis. Infect Control Hosp Epidemiol 2012; 33:354–361.
- Magill SS, Edwards JR, Beldavs ZG, et al; Emerging Infections Program Healthcare-Associated Infections and Antimicrobial Use Prevalence Survey Team. Prevalence of antimicrobial use in US acute care hospitals, May–September 2011. JAMA 2014; 312:1438–1446.
- Chandra A, Nicks B, Maniago E, Nouh A, Limkakeng A. A multicenter analysis of the ED diagnosis of pneumonia. Am J Emerg Med 2010; 28:862–865.
- Avdic E, Cushinotto LA, Hughes AH, et al. Impact of an antimicrobial stewardship intervention on shortening the duration of therapy for community-acquired pneumonia. Clin Infect Dis 2012; 54:1581–1587.
- Dalen DM, Zvonar RK, Jessamine PG, et al. An evaluation of the management of asymptomatic catheter-associated bacteriuria and candiduria at the Ottawa Hospital. Can J Infect Dis Med Microbiol 2005; 16:166–170.
- Gandhi T, Flanders SA, Markovitz E, Saint S, Kaul DR. Importance of urinary tract infection to antibiotic use among hospitalized patients. Infect Control Hosp Epidemiol 2009; 30:193–195.
- Cope M, Cevallos ME, Cadle RM, Darouiche RO, Musher DM, Trautner BW. Inappropriate treatment of catheter-associated asymptomatic bacteriuria in a tertiary care hospital. Clin Infect Dis 2009; 48:1182–1188.
- Trautner BW, Grigoryan L, Petersen NJ, et al. Effectiveness of an antimicrobial stewardship approach for urinary catheter-associated asymptomatic bacteriuria. JAMA Intern Med 2015; 175:1120–1127.
- Shrestha NK, Bhaskaran A, Scalera NM, Schmitt SK, Rehm SJ, Gordon SM. Antimicrobial stewardship at transition of care from hospital to community. Infect Control Hosp Epidemiol 2012; 33:401–404.
- Meeker D, Knight TK, Friedberg MW, et al. Nudging guideline-concordant antibiotic prescribing: a randomized clinical trial. JAMA Intern Med 2014; 174:425–431.
- Meeker D, Linder JA, Fox CR, et al. Effect of behavioral interventions on inappropriate antibiotic prescribing among primary care practices: a randomized clinical trial. JAMA 2016; 315:562–572.
Antibiotic stewardship has always been a good idea. Now it is also required by the Joint Commission and the Centers for Medicare & Medicaid Services (CMS). This article reviews the state of antibiotic use in the United States and efforts to improve antibiotic stewardship in practice.
ANTIBIOTICS ARE DIFFERENT FROM OTHER DRUGS
Their efficacy wanes over time. Antibiotics are the only medications that become less useful over time even when used correctly. Other types of drugs are continuously being improved, but the old ones still work as well today as when they were introduced; antibiotics that were in use 50 years ago, by contrast, are no longer as effective.
They are a shared resource. Antibiotics are regularly used by many specialties to deliver routine and advanced medical care. Surgeries, transplantation, and immunosuppressive therapy would be unsafe without antibiotics to treat infections. Some patients awaiting lung transplant are not considered good candidates if they have evidence of colonization by antibiotic-resistant organisms.
Individual use may harm others. Even people who are not exposed to an antibiotic can suffer the consequences of how others use them.
In a retrospective cohort study, Freedberg et al1 analyzed the risk of hospitalized patients developing Clostridium difficile infection and found that the risk was higher if the previous occupant of the bed had received antibiotics. The putative mechanism is that a patient receiving antibiotics develops altered gut flora, leading to C difficile spores released into the environment and not eradicated by normal cleaning. The next patient using the bed is then exposed and infected.
ANTIBIOTIC USE IS HIGH
The US Centers for Disease Control and Prevention (CDC) monitors antibiotic prescriptions throughout the United States. In the outpatient setting, enough antibiotics are prescribed nationwide for 5 out of every 6 people to get 1 course of antibiotics annually (835 prescriptions per 1,000 people). Rates vary widely among states, with the lowest rate in Alaska (501 prescriptions per 1,000 people) and the highest in West Virginia (1,285 prescriptions per 1,000 people).2 In comparison, Scandinavian countries prescribe about 400 courses per 1,000 people, about 20% less than our lowest-prescribing state.3
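The per-capita figures above convert directly, as this short arithmetic check shows (all numbers are the ones quoted in the text):

```python
# 835 prescriptions per 1,000 people is 0.835 courses per person per
# year, roughly 5 of every 6 people (5/6 is about 0.833).
us_rate = 835 / 1000
print(round(us_rate, 3))

# Scandinavian rate (~400 per 1,000) vs the lowest US state, Alaska
# (501 per 1,000): about a 20% difference.
scandinavia, alaska = 400, 501
print(round((alaska - scandinavia) / alaska, 2))
```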
Antibiotics are probably the most frequently prescribed drugs in US hospitals. Data from 2006 to 2012 showed that 55% of hospitalized patients received at least 1 dose of an antibiotic and that overall about 75% of all hospital days involved an antibiotic.4 Rates did not vary by hospital size, but nonteaching hospitals tended to use antibiotics more than teaching hospitals. Antibiotic use is much more common in intensive care units than in hospital wards (1,092 and 720 days of antibiotic treatment per 1,000 patient-days, respectively).
Although overall antibiotic use did not change significantly over the years of the survey, use patterns did: fluoroquinolone use dropped by 20%, possibly reflecting rising resistance or increased attention to associated side effects (although fluoroquinolones remain the most widely prescribed inpatient antibiotic class), and use of first-generation cephalosporins fell by 7%. A cause for concern is that the use of broad-spectrum and “last-resort” antibiotics increased: carbapenem use by 37%, vancomycin use by 32%, beta-lactam/beta-lactamase inhibitor use by 26%, and third- and fourth-generation cephalosporin use by 12%.4
About one-third of use is unnecessary
Many studies have tried to measure the extent of inappropriate or unnecessary antibiotic use. The results have been remarkably consistent at 20% to 40% for both inpatient and outpatient studies. One study of hospitalized patients not in the intensive care unit found that 30% of 1,941 days of prescribed antimicrobial therapy were unnecessary, mostly because patients received antibiotics for longer than needed or because antibiotics were used to treat noninfectious syndromes or colonizing microorganisms.5
ANTIBIOTIC EXPOSURE HAS NEGATIVE CONSEQUENCES
Any exposure to a medication involves the potential for side effects; this is true for antibiotics whether or not their use is appropriate. An estimated 140,000 visits to emergency departments occur annually for adverse reactions to antibiotics.6 In hospitalized patients, these reactions can be severe, including renal and bone marrow toxicity. As with any medications, the risks and benefits of antibiotic therapy must be weighed patient by patient.
Disturbance of gut microbiome
Antibiotics’ disruptive effects on normal gut flora are becoming better understood and are even believed to increase the risk of obesity and asthma.7,8
Animal models provide evidence that altered flora is associated with sepsis, which is attributed to the gut microbiome’s role in containing dissemination of bacteria in the body.9 An ecological study provides further evidence. Baggs et al10 retrospectively studied more than 9 million patients discharged without sepsis from 473 US hospitals, of whom 0.6% were readmitted for sepsis within 90 days. Exposure to a broad-spectrum antibiotic was associated with a 50% increased risk of readmission within 90 days of discharge because of sepsis (odds ratio 1.50, 95% confidence interval 1.47–1.53).
Increase of C difficile infections
Antibiotics exert selective pressure, killing susceptible bacteria and allowing resistant bacteria to thrive.
The risk of C difficile infection is 7 to 10 times higher than at baseline for 1 month after antibiotic use and 3 times higher than baseline in the 2 months after that.11 Multiple studies have found that stewardship efforts to reduce antibiotic use have resulted in fewer C difficile infections.
A nationwide effort in England over the past decade to reduce C difficile infections has resulted in 50% less use of fluoroquinolones and third-generation cephalosporins in patients over age 65. During that time, the incidence of C difficile infection in that age group fell by about 70%, with concomitant reductions in mortality and colectomy associated with infection. No increase in rates of hospital admissions, infection complications, or death was observed.12–14
GOAL: BETTER CARE (NOT CHEAPER CARE OR LESS ANTIBIOTIC USE)
The primary goal of antibiotic stewardship is better patient care. The goal is not reduced antibiotic use or cost savings, although these could be viewed as favorable side effects. Sometimes, better patient care involves using more antibiotics: eg, a patient with presumed sepsis should be started quickly on broad-spectrum antibiotics, an action that also falls under antibiotic stewardship. The focus for stewardship efforts should be on optimizing appropriate use, ie, promoting the use of the right agent at the correct dosage and for the proper duration.
Stewardship improves clinical outcomes
Antibiotic stewardship is important not only to society but to individual patients.
Singh et al15 randomized patients suspected of having ventilator-associated pneumonia (but with a low likelihood of pneumonia) to either a 3-day course of ciprofloxacin or standard care (antibiotics for 10 to 21 days, with the drug and duration chosen by the treating physician). After 3 days, the patients in the experimental group were reevaluated, and antibiotics were stopped if the likelihood of pneumonia was still deemed low. In patients who received only the short course of antibiotics, mean length of stay in the intensive care unit was 9 days and the risk of acquiring an antibiotic-resistant superinfection during hospitalization was 14%, compared with a 15-day length of stay and 38% risk of antibiotic-resistant superinfection in patients in the standard treatment group.
Fishman16 reported a study at a single hospital that randomized patients to either receive standard care according to physician choice or be treated according to an antibiotic stewardship program. Patients in the antibiotic stewardship group were almost 3 times more likely than controls to receive appropriate therapy according to guidelines. More important, the antibiotic stewardship patients were almost twice as likely to be cured of their infection and were more than 80% less likely to have treatment failure.
DEVELOPING EFFECTIVE ANTIBIOTIC STEWARDSHIP PROGRAMS
A good model for improving antibiotic use is a recent nationwide program designed to reduce central line-associated bloodstream infections.17 Rates of these infections have dropped by about 50% over the past 5 years. The program included:
- Research to better understand the problem and how to fight it
- Well-defined programs and interventions
- Education to implement interventions, eg, deploying teams to teach better techniques of inserting and maintaining central lines
- A strong national measurement system (the CDC’s National Healthcare Safety Network) to track infections.
What constitutes an antibiotic stewardship program?
The CDC examined successful stewardship programs in a variety of hospital types, including large academic hospitals and smaller hospitals, and identified 7 core elements common to successful antibiotic stewardship programs, which can serve as general principles18:
- Leadership commitment from administration
- A single leader responsible for outcomes
- A single pharmacy leader
- Tracking of antibiotic use
- Regular reporting of antibiotic use and resistance
- Educating providers on use and resistance
- Specific improvement interventions.
Stewardship is harder in some settings
Asymptomatic bacteriuria is often misdiagnosed as a urinary tract infection and treated unnecessarily.29–31
Trautner et al32 addressed this problem by targeting urine cultures rather than antibiotics, using a simple algorithm: if a patient did not have symptoms of urinary tract infection (fever, acute hematuria, delirium, rigors, flank pain, pelvic discomfort, urgency, frequency, dysuria, suprapubic pain), a urine culture was not recommended. If a patient did have symptoms but a problem other than urinary tract infection was deemed likely, evaluation of other sources of infection was recommended. Use of the algorithm resulted in fewer urine cultures and less antibiotic overtreatment of asymptomatic bacteriuria. Reductions persisted after the intervention ended.
Antibiotic time-out at hospital discharge
Another study evaluated an intervention that required a pharmacist consultation for the critical care team when a patient was to be discharged with intravenous antibiotics (most often for pneumonia). In 28% of cases, chart review revealed that the infection had been completely treated at the time of discharge, so further antibiotic treatment was not indicated. No patients who avoided antibiotics at discharge were readmitted or subsequently visited the emergency department.33
Targeting outpatient settings
A number of studies have evaluated simple interventions to improve outpatient antibiotic prescribing. Meeker et al34 had providers place a poster in their examination rooms with a picture of the physician and a signed letter committing to the appropriate use of antibiotics. Inappropriate antibiotic use decreased 20% in the intervention group vs controls (P = .02).
In a subsequent study,35 the same group required providers to include a justification note in the electronic medical record every time an antibiotic was prescribed for an indication when guidelines do not recommend one. Inappropriate prescribing dropped from 23% to 5% (P < .001).
Another intervention in this study35 provided physicians with periodic feedback according to whether their therapy was concordant with guidelines. They received an email with a subject line of either “You are a top performer” or “You are not a top performer.” The contents of the email provided data on how many antibiotic prescriptions they wrote for conditions that did not warrant them and how their prescribing habits compared with those of their top-performing peers. Mean inappropriate antibiotic prescribing fell from 20% to 4%.35
This is a critical time for antibiotic stewardship efforts in the United States. The need has never been more urgent and, fortunately, the opportunities have never been more abundant. Requirements for stewardship programs will drive implementation, but hospitals will need support and guidance to help ensure that stewardship programs are as effective as possible. Ultimately, improving antibiotic use will require collaboration among all stakeholders. CDC is eager to partner with providers and others in their efforts to improve antibiotic use.
Antibiotic stewardship has always been a good idea. Now it is also required by the Joint Commission and the Centers for Medicare and Medicaid Services (CMS). This article reviews the state of antibiotic use in the United States and efforts to improve antibiotic stewardship in practice.
ANTIBIOTICS ARE DIFFERENT FROM OTHER DRUGS
Their efficacy wanes over time. Antibiotics are the only medications that become less useful over time even if used correctly. Although other types of drugs are continuously being improved, the old ones work as well today as they did when they first came out. But antibiotics that were in use 50 years ago are no longer as effective.
They are a shared resource. Antibiotics are regularly used by many specialties to deliver routine and advanced medical care. Surgeries, transplantation, and immunosuppressive therapy would be unsafe without antibiotics to treat infections. Some patients awaiting lung transplant are not considered good candidates if they have evidence of colonization by antibiotic-resistant organisms.
Individual use may harm others. Even people who are not exposed to an antibiotic can suffer the consequences of how others use them.
In a retrospective cohort study, Freedberg et al1 analyzed the risk of hospitalized patients developing Clostridium difficile infection and found that the risk was higher if the previous occupant of the bed had received antibiotics. The putative mechanism is that a patient receiving antibiotics develops altered gut flora, leading to C difficile spores released into the environment and not eradicated by normal cleaning. The next patient using the bed is then exposed and infected.
ANTIBIOTIC USE IS HIGH
The US Centers for Disease Control and Prevention (CDC) monitors antibiotic prescriptions throughout the United States. In the outpatient setting, enough antibiotics are prescribed nationwide for 5 out of every 6 people to get 1 course of antibiotics annually (835 prescriptions per 1,000 people). Rates vary widely among states, with the lowest rate in Alaska (501 prescriptions per 1,000 people) and the highest in West Virginia (1,285 prescriptions per 1,000 people).2 In comparison, Scandinavian countries prescribe about 400 courses per 1,000 people, about 20% less than our lowest-prescribing state.3
Antibiotics are probably the most frequently prescribed drugs in US hospitals. Data from 2006 to 2012 showed that 55% of hospitalized patients received at least 1 dose of an antibiotic and that overall about 75% of all hospital days involved an antibiotic.4 Rates did not vary by hospital size, but nonteaching hospitals tended to use antibiotics more than teaching hospitals. Antibiotic use is much more common in intensive care units than in hospital wards (1,092 and 720 days of antibiotic treatment per 1,000 patient-days, respectively).
Although overall antibiotic use did not change significantly over the years of the survey, use patterns did: fluoroquinolone use dropped by 20%, possibly reflecting rising resistance or increased attention to associated side effects (although fluoroquinolones remain the most widely prescribed inpatient antibiotic class), and use of first-generation cephalosporins fell by 7%. A cause for concern is that the use of broad-spectrum and “last-resort” antibiotics increased: carbapenem use by 37%, vancomycin use by 32%, beta-lactam/beta-lactamase inhibitor use by 26%, and third- and fourth-generation cephalosporin use by 12%.4
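The "days of therapy per 1,000 patient-days" figures above follow a simple normalization: each calendar day a patient receives a given antibiotic counts as 1 day of therapy, and the total is scaled to the unit's patient-days. A minimal sketch (the example inputs are illustrative, not from the cited survey):

```python
def dot_per_1000_patient_days(days_of_therapy: int, patient_days: int) -> float:
    """Antibiotic days of therapy (DOT) normalized per 1,000 patient-days.

    Each day a patient receives one antibiotic counts as 1 DOT; a patient
    on 2 antibiotics on the same day accrues 2 DOT.
    """
    if patient_days <= 0:
        raise ValueError("patient_days must be positive")
    return 1000 * days_of_therapy / patient_days

# Hypothetical ICU: 5,460 DOT over 5,000 patient-days
print(dot_per_1000_patient_days(5460, 5000))  # -> 1092.0
```

Normalizing to patient-days is what makes units of different sizes, such as an intensive care unit and a general ward, directly comparable.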
About one-third of use is unnecessary
Many studies have tried to measure the extent of inappropriate or unnecessary antibiotic use. The results have been remarkably consistent at 20% to 40% for both inpatient and outpatient studies. One study of hospitalized patients not in the intensive care unit found that 30% of 1,941 days of prescribed antimicrobial therapy were unnecessary, mostly because patients received antibiotics for longer than needed or because antibiotics were used to treat noninfectious syndromes or colonizing microorganisms.5
ANTIBIOTIC EXPOSURE HAS NEGATIVE CONSEQUENCES
Any exposure to a medication involves the potential for side effects; this is true for antibiotics whether or not their use is appropriate. An estimated 140,000 visits to emergency departments occur annually for adverse reactions to antibiotics.6 In hospitalized patients, these reactions can be severe, including renal and bone marrow toxicity. As with any medications, the risks and benefits of antibiotic therapy must be weighed patient by patient.
Disturbance of gut microbiome
Antibiotics’ disruptive effects on normal gut flora are becoming better understood and are even believed to increase the risk of obesity and asthma.7,8
Animal models provide evidence that altered flora is associated with sepsis, which is attributed to the gut microbiome’s role in containing dissemination of bacteria in the body.9 An ecological study provides further evidence. Baggs et al10 retrospectively studied more than 9 million patients discharged without sepsis from 473 US hospitals, of whom 0.6% were readmitted for sepsis within 90 days. Exposure to a broad-spectrum antibiotic was associated with a 50% increased risk of readmission for sepsis within 90 days of discharge (odds ratio 1.50, 95% confidence interval 1.47–1.53).
Increase of C difficile infections
Antibiotics exert selective pressure, killing susceptible bacteria and allowing resistant bacteria to thrive.
The risk of C difficile infection is 7 to 10 times higher than at baseline for 1 month after antibiotic use and 3 times higher than baseline in the 2 months after that.11 Multiple studies have found that stewardship efforts to reduce antibiotic use have resulted in fewer C difficile infections.
A nationwide effort in England over the past decade to reduce C difficile infections has resulted in 50% less use of fluoroquinolones and third-generation cephalosporins in patients over age 65. During that time, the incidence of C difficile infection in that age group fell by about 70%, with concomitant reductions in mortality and colectomy associated with infection. No increase in rates of hospital admissions, infection complications, or death was observed.12–14
GOAL: BETTER CARE (NOT CHEAPER CARE OR LESS ANTIBIOTIC USE)
The primary goal of antibiotic stewardship is better patient care. The goal is not reduced antibiotic use or cost savings, although these could be viewed as favorable side effects. Sometimes, better patient care involves using more antibiotics: eg, a patient with presumed sepsis should be started quickly on broad-spectrum antibiotics, an action that also falls under antibiotic stewardship. The focus for stewardship efforts should be on optimizing appropriate use, ie, promoting the use of the right agent at the correct dosage and for the proper duration.
Stewardship improves clinical outcomes
Antibiotic stewardship is important not only to society but to individual patients.
Singh et al15 randomized patients suspected of having ventilator-associated pneumonia (but with a low likelihood of pneumonia) to either a 3-day course of ciprofloxacin or standard care (antibiotics for 10 to 21 days, with the drug and duration chosen by the treating physician). After 3 days, the patients in the experimental group were reevaluated, and antibiotics were stopped if the likelihood of pneumonia was still deemed low. In patients who received only the short course of antibiotics, mean length of stay in the intensive care unit was 9 days and the risk of acquiring an antibiotic-resistant superinfection during hospitalization was 14%, compared with a 15-day length of stay and 38% risk of antibiotic-resistant superinfection in patients in the standard treatment group.
Fishman16 reported a study at a single hospital that randomized patients to either receive standard care according to physician choice or be treated according to an antibiotic stewardship program. Patients in the antibiotic stewardship group were almost 3 times more likely than controls to receive appropriate therapy according to guidelines. More important, the antibiotic stewardship patients were almost twice as likely to be cured of their infection and were more than 80% less likely to have treatment failure.
DEVELOPING EFFECTIVE ANTIBIOTIC STEWARDSHIP PROGRAMS
A good model for improving antibiotic use is a recent nationwide program designed to reduce central line-associated bloodstream infections.17 Rates of these infections have dropped by about 50% over the past 5 years. The program included:
- Research to better understand the problem and how to fight it
- Well-defined programs and interventions
- Education to implement interventions, eg, deploying teams to teach better techniques of inserting and maintaining central lines
- A strong national measurement system (the CDC’s National Healthcare Safety Network) to track infections.
What constitutes an antibiotic stewardship program?
The CDC examined successful stewardship programs in a variety of hospital types, including large academic hospitals and smaller hospitals, and identified 7 core elements common to successful antibiotic stewardship programs18:
- Leadership commitment from administration
- A single leader responsible for outcomes
- A single pharmacy leader
- Tracking of antibiotic use
- Regular reporting of antibiotic use and resistance
- Educating providers on use and resistance
- Specific improvement interventions.
Stewardship is harder in some settings
In reply to a CDC survey in 2014, 41% of more than 4,000 hospitals reported that they had antibiotic stewardship programs with all 7 core elements. The single element that predicted whether a complete program was in place was leadership support.19 The following year, 48% of respondents reported that they had a complete program in place. Percentages varied among states, with highs in Utah (77%) and California (70%) and lows in North Dakota (12%) and Vermont (7%). Large hospitals and major teaching hospitals were more likely to have a program with all 7 elements: 31% of hospitals with 50 or fewer beds had a complete program vs 66% of hospitals with at least 200 beds.20
Short-stay, critical-access hospitals pose a special challenge, as only 26% reported having all core elements.19,20 These facilities have fewer than 25 beds, and many patient stays are less than 3 days. Some do not employ full-time pharmacists or full-time clinicians. The CDC is collaborating with the American Hospital Association and the Pew Charitable Trusts to focus efforts on helping these hospitals, which requires a more flexible approach. About 100 critical-access hospitals nationwide have reported implementing all of the core elements and can serve as models for the others.
MEASURING IMPROVEMENT
The CDC has adopted a 3-pronged approach to measuring improvements in hospital antibiotic use:
- Estimate national aggregate antibiotic use, as described above
- Acquire information on antibiotic use at facility, practice, and provider levels
- Assess appropriate antibiotic use.
In hospitals, the CDC has concentrated on facility-level measurement. Hospitals need a system to track their own use and compare it with that of similar facilities. The CDC’s monitoring program, the Antibiotic Use Option of the National Healthcare Safety Network, captures electronic data on antibiotic use in a facility, enabling monitoring of use in each unit. Data can also be aggregated at regional, state, and national levels. This information can be used to develop benchmarks for antibiotic use, so that similar hospitals can be compared.
What is the ‘right’ amount of antibiotic use? Enter SAAR
Creating benchmarks for antibiotic use poses a number of challenges compared with most other areas in healthcare. Most public health measures are binary—eg, people either get an infection, a vaccination, or a smoking cessation intervention or not—and the direction of progress is clear. Antibiotics are different: not everybody needs them, but some people do. Usage should be reduced, but by exactly how much is unclear and varies between hospitals. In addition, being an outlier does not necessarily indicate a problem: a hospital unit for organ transplants will have high rates of antibiotic use, which is likely appropriate.
The CDC has taken initial steps to develop a risk-adjusted benchmark measure for hospital antibiotic use, the Standardized Antimicrobial Administration Ratio (SAAR). It compares a hospital’s observed antibiotic use with a calculation of predicted use based on its facility characteristics. Although still at an early stage, SAAR has been released and has been endorsed by the National Quality Forum. About 200 hospitals are submitting data to the CDC and collaborating with the CDC to evaluate the SAAR’s utility in driving improved antibiotic use.
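At its core, the SAAR is a ratio of observed to predicted antibiotic use, analogous to other standardized ratios in healthcare quality measurement. A minimal sketch of the concept (the predicted value here is a placeholder; the CDC derives it from a risk-adjustment model based on facility characteristics):

```python
def saar(observed_dot: float, predicted_dot: float) -> float:
    """Standardized Antimicrobial Administration Ratio: observed / predicted
    days of antibiotic therapy. A value near 1.0 means use is close to what
    the risk-adjusted model predicts for a comparable facility."""
    if predicted_dot <= 0:
        raise ValueError("predicted days of therapy must be positive")
    return observed_dot / predicted_dot

# Hypothetical unit: 620 observed vs 500 predicted days of therapy
print(round(saar(620.0, 500.0), 2))  # -> 1.24
```

A SAAR above 1.0 flags higher-than-predicted use, but, as noted above, being an outlier is a prompt for review rather than proof of inappropriate prescribing.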
Problems in measuring appropriate use
Measuring appropriate antibiotic use is easier in the outpatient setting, where detailed data have been collected for many years.
Fleming-Dutra et al21 compared medications prescribed during outpatient visits and the diagnoses coded for the visits. They found that about 13% of all outpatient visits resulted in an antibiotic prescription, 30% of which were written for visits with no listed diagnosis that would justify an antibiotic (eg, viral upper respiratory infection). This kind of information provides a target for stewardship programs.
It is more difficult to conduct such a study in a hospital setting. Simply comparing discharge diagnoses to antibiotics prescribed is not useful: often antibiotics are started presumptively on admission for a patient with signs and symptoms of an infection, then stopped if the diagnosis does not warrant antibiotics, which is a reasonable strategy.
Also, many times, a patient with asymptomatic bacteriuria, which does not warrant antibiotics, is misdiagnosed as having a urinary tract infection, which does. So simply looking at the discharge code may not reveal whether therapy was appropriate.
Some studies have provided useful information. Fridkin et al22 studied vancomycin use in 36 hospitals; vancomycin is an especially good candidate drug for study because guidelines exist for appropriate use. Data were collected only from patients given vancomycin for more than 3 days, which should have eliminated empiric use of the drug and included only pathogen-driven therapy. Cases where therapy was for skin and soft-tissue infections were excluded because cultures are not usually obtained for these cases. Of patients given vancomycin, 9% had no diagnostic culture obtained at antibiotic initiation, 22% had a diagnostic culture but results showed no gram-positive bacterial growth, and 5% had culture results revealing only oxacillin-susceptible Staphylococcus aureus. In 36% of cases, opportunities existed for improved prescribing.
Such data could be collected from the electronic medical record, and the CDC is focusing efforts in this direction.
NATIONAL ACTIVITIES IN ANTIBIOTIC STEWARDSHIP
In 2014, the White House launched a national strategy to combat antibiotic resistance,23 followed by an action plan in 2015.24 As a result, new investments have been made to improve antibiotic use, including funding for state health departments to begin stewardship efforts and to expand public awareness of the problems of antibiotic overuse. Research efforts are also being funded to improve implementation of existing stewardship practices and to develop new ones.
CMS is also exploring how to drive improved antibiotic use. In October 2016, it started requiring all US nursing homes to have antibiotic stewardship programs, and a similar requirement for hospitals has been proposed.
The Joint Commission issued a standard requiring that all their accredited facilities, starting with hospitals, have an antibiotic stewardship program by January 2017. This standard requires implementation of all the CDC’s core elements.
PROVEN INTERVENTIONS
Focusing on key interventions that are likely to be effective and well received by providers is a useful strategy for antibiotic stewardship efforts. A number of such interventions have been supported by research.
Postprescription antibiotic reviews or antibiotic ‘time-outs’
Antibiotics are often started empirically to treat hospitalized patients suspected of having an infection. The need for the antibiotic should be assessed a few days later, when culture results and more clinical information are available.
Elligsen et al25 evaluated the effects of providing a formal review and suggestions for antimicrobial optimization to critical care teams of 3 intensive care units in a single hospital after 3 and 10 days of antibiotic therapy. Mean monthly antibiotic use decreased from 644 days of therapy per 1,000 patient-days in the preintervention period to 503 days of therapy per 1,000 patient-days (P < .0001). C difficile infections were reduced from 11 cases to 6. Overall gram-negative susceptibility to meropenem increased in the critical care units.
Targeting specific infections
Some infections are especially important to target with improvement efforts.
In 2011, Magill et al26 conducted 1-day prevalence surveys in 183 hospitals in 10 states to examine patterns of antibiotic use. They found that lower respiratory tract infections and urinary tract infections accounted for more than half of all antibiotic use (35% and 22%, respectively), making them good candidates for improved use.
Community-acquired pneumonia can be targeted on multiple fronts. One study showed that almost 30% of patients diagnosed with community-acquired pneumonia in the emergency department did not actually have pneumonia.27 Duration of antibiotic therapy could also be targeted. Guidelines recommend that most patients with uncomplicated community-acquired pneumonia receive 5 to 7 days of antibiotic therapy. Avdic et al28 performed a simple intervention involving education and feedback to teams in 1 hospital regarding antibiotic choice and duration. This reduced the duration of therapy for community-acquired pneumonia from a median of 10 days to 7 days.
Asymptomatic bacteriuria is often misdiagnosed as a urinary tract infection and treated unnecessarily.29–31
Trautner et al32 addressed this problem by targeting urine cultures rather than antibiotics, using a simple algorithm: if a patient did not have symptoms of urinary tract infection (fever, acute hematuria, delirium, rigors, flank pain, pelvic discomfort, urgency, frequency, dysuria, suprapubic pain), a urine culture was not recommended. If a patient did have symptoms but a problem other than urinary tract infection was deemed likely, evaluation of other sources of infection was recommended. Use of the algorithm resulted in fewer urine cultures and less antibiotic overtreatment of asymptomatic bacteriuria. Reductions persisted after the intervention ended.
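The algorithm of Trautner et al lends itself to a simple decision sketch. The function below is an illustrative encoding of the logic described above, not an official implementation of the study protocol, and the symptom strings are assumptions chosen to mirror the text:

```python
# Symptoms of urinary tract infection listed in the algorithm
UTI_SYMPTOMS = {
    "fever", "acute hematuria", "delirium", "rigors", "flank pain",
    "pelvic discomfort", "urgency", "frequency", "dysuria", "suprapubic pain",
}

def recommend_urine_culture(symptoms: set, other_source_likely: bool = False) -> str:
    """Return the algorithm's recommendation for a given patient.

    No UTI symptoms -> no urine culture (avoids detecting and treating
    asymptomatic bacteriuria). Symptoms present but another source of
    infection deemed likely -> work up the other source first.
    """
    if not (symptoms & UTI_SYMPTOMS):
        return "no urine culture recommended"
    if other_source_likely:
        return "evaluate other sources of infection"
    return "urine culture reasonable"

print(recommend_urine_culture(set()))          # -> no urine culture recommended
print(recommend_urine_culture({"dysuria"}))    # -> urine culture reasonable
```

The design point is that the intervention gates the diagnostic test rather than the antibiotic: a culture that is never sent cannot produce an incidental positive that prompts unnecessary treatment.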
Antibiotic time-out at hospital discharge
Another study evaluated an intervention that required a pharmacist consultation for the critical care team when a patient was to be discharged with intravenous antibiotics (most often for pneumonia). In 28% of cases, chart review revealed that the infection had been completely treated at the time of discharge, so further antibiotic treatment was not indicated. No patients who avoided antibiotics at discharge were readmitted or subsequently visited the emergency department.33
Targeting outpatient settings
A number of studies have evaluated simple interventions to improve outpatient antibiotic prescribing. Meeker et al34 had providers place a poster in their examination rooms with a picture of the physician and a signed letter committing to the appropriate use of antibiotics. Inappropriate antibiotic use decreased 20% in the intervention group vs controls (P = .02).
In a subsequent study,35 the same group required providers to include a justification note in the electronic medical record every time an antibiotic was prescribed for an indication for which guidelines do not recommend one. Inappropriate prescribing dropped from 23% to 5% (P < .001).
Another intervention in this study35 provided physicians with periodic feedback according to whether their therapy was concordant with guidelines. They received an email with a subject line of either “You are a top performer” or “You are not a top performer.” The contents of the email provided data on how many antibiotic prescriptions they wrote for conditions that did not warrant them and how their prescribing habits compared with those of their top-performing peers. Mean inappropriate antibiotic prescribing fell from 20% to 4%.35
This is a critical time for antibiotic stewardship efforts in the United States. The need has never been more urgent and, fortunately, the opportunities have never been more abundant. Requirements for stewardship programs will drive implementation, but hospitals will need support and guidance to help ensure that stewardship programs are as effective as possible. Ultimately, improving antibiotic use will require collaboration among all stakeholders. The CDC is eager to partner with providers and others in their efforts to improve antibiotic use.
- Freedberg DE, Salmasian H, Cohen B, Abrams JA, Larson EL. Receipt of antibiotics in hospitalized patients and risk for Clostridium difficile infection in subsequent patients who occupy the same bed. JAMA Intern Med 2016; 176:1801–1808.
- Centers for Disease Control and Prevention. Get smart: know when antibiotics work. Measuring outpatient antibiotic prescribing. https://www.cdc.gov/getsmart/community/programs-measurement/measuring-antibiotic-prescribing.html. Accessed February 5, 2017.
- Ternhag A, Hellman J. More on U.S. outpatient antibiotic prescribing, 2010. N Engl J Med 2013; 369:1175–1176.
- Baggs J, Fridkin SK, Pollack LA, Srinivasan A, Jernigan JA. Estimating national trends in inpatient antibiotic use among US hospitals from 2006 to 2012. JAMA Intern Med 2016; 176:1639–1648.
- Hecker MT, Aron DC, Patel NP, Lehmann MK, Donskey CJ. Unnecessary use of antimicrobials in hospitalized patients: current patterns of misuse with an emphasis on the antianaerobic spectrum of activity. Arch Intern Med 2003; 163:972–978.
- Shehab N, Patel PR, Srinivasan A, Budnitz DS. Emergency department visits for antibiotic-associated adverse events. Clin Infect Dis 2008; 47:735–743.
- Korpela K, de Vos WM. Antibiotic use in childhood alters the gut microbiota and predisposes to overweight. Microb Cell 2016; 3:296–298.
- Gray LE, O’Hely M, Ranganathan S, Sly PD, Vuillermin P. The maternal diet, gut bacteria, and bacterial metabolites during pregnancy influence offspring asthma. Front Immunol 2017; 8:365.
- Haak BW, Wiersinga WJ. The role of the gut microbiota in sepsis. Lancet Gastroenterol Hepatol 2017; 2:135–143.
- Baggs J, Jernigan J, McCormick K, Epstein L, Laufer-Halpin AS, McDonald C. Increased risk of sepsis during hospital readmission following exposure to certain antibiotics during hospitalization. Abstract presented at IDWeek, October 26–30, 2016, New Orleans, LA. https://idsa.confex.com/idsa/2016/webprogram/Paper58587.html. Accessed August 8, 2017.
- Hensgens MP, Goorhuis A, Dekkers OM, Kuijper EJ. Time interval of increased risk for Clostridium difficile infection after exposure to antibiotics. J Antimicrob Chemother 2012; 67:742–748.
- Ashiru-Oredope D, Sharland M, Charani E, McNulty C, Cooke J; ARHAI Antimicrobial Stewardship Group. Improving the quality of antibiotic prescribing in the NHS by developing a new antimicrobial stewardship programme: Start Smart—Then Focus. J Antimicrob Chemother 2012; 67(suppl 1):i51–i63.
- Wilcox MH, Shetty N, Fawley WN, et al. Changing epidemiology of Clostridium difficile infection following the introduction of a national ribotyping-based surveillance scheme in England. Clin Infect Dis 2012; 55:1056–1063.
- Public Health England. Clostridium difficile infection: monthly data by NHS acute trust. https://www.gov.uk/government/statistics/clostridium-difficile-infection-monthly-data-by-nhs-acute-trust. Accessed August 4, 2017.
- Singh N, Rogers P, Atwood CW, Wagener MM, Yu VL. Short-course empiric antibiotic therapy for patients with pulmonary infiltrates in the intensive care unit. A proposed solution for indiscriminate antibiotic prescription. Am J Respir Crit Care Med 2000; 162:505–511.
- Fishman N. Antimicrobial stewardship. Am J Med 2006; 119:S53–S61.
- US Centers for Disease Control and Prevention. Healthcare-associated infections (HAI) progress report. https://www.cdc.gov/hai/surveillance/progress-report/index.html. Accessed August 4, 2017.
- US Centers for Disease Control and Prevention. Get Smart for Healthcare. Core elements of hospital antibiotic stewardship programs. https://www.cdc.gov/getsmart/healthcare/implementation/core-elements.html. Accessed August 8, 2017.
- Pollack LA, van Santen KL, Weiner LM, Dudeck MA, Edwards JR, Srinivasan A. Antibiotic stewardship programs in U.S. acute care hospitals: findings from the 2014 National Healthcare Safety Network Annual Hospital Survey. Clin Infect Dis 2016; 63:443–449.
- US Centers for Disease Control and Prevention. Antibiotic stewardship in acute care hospitals by state 2014. https://gis.cdc.gov/grasp/PSA/STMapView.html. Accessed August 4, 2017.
- Fleming-Dutra KE, Hersh AL, Shapiro DJ, et al. Prevalence of inappropriate antibiotic prescriptions among US ambulatory care visits, 2010–2011. JAMA 2016; 315:1864–1873.
- Fridkin S, Baggs J, Fagan R, et al; Centers for Disease Control and Prevention (CDC). Vital signs: improving antibiotic use among hospitalized patients. MMWR Morb Mortal Wkly Rep 2014; 63:194–200.
- National strategy for combating antibiotic-resistant bacteria. September 2014. https://www.whitehouse.gov/sites/default/files/docs/carb_national_strategy.pdf. Accessed August 9, 2017.
- National action plan for combating antibiotic-resistant bacteria. March 2015. https://www.whitehouse.gov/sites/default/files/docs/national_action_plan_for_combating_antibotic-resistant_bacteria.pdf
- Elligsen M, Walker SA, Pinto R, et al. Audit and feedback to reduce broad-spectrum antibiotic use among intensive care unit patients: a controlled interrupted time series analysis. Infect Control Hosp Epidemiol 2012; 33:354–361.
- Magill SS, Edwards JR, Beldavs ZG, et al; Emerging Infections Program Healthcare-Associated Infections and Antimicrobial Use Prevalence Survey Team. Prevalence of antimicrobial use in US acute care hospitals, May–September 2011. JAMA 2014; 312:1438–1446.
- Chandra A, Nicks B, Maniago E, Nouh A, Limkakeng A. A multicenter analysis of the ED diagnosis of pneumonia. Am J Emerg Med 2010;28:862–865.
- Avdic E, Cushinotto LA, Hughes AH, et al. Impact of an antimicrobial stewardship intervention on shortening the duration of therapy for community-acquired pneumonia. Clin Infect Dis 2012; 54:1581–1587.
- Dalen DM, Zvonar RK, Jessamine PG, et al. An evaluation of the management of asymptomatic catheter-associated bacteriuria and candiduria at the Ottawa Hospital. Can J Infect Dis Med Micribiol 2005; 16:166–170.
- Gandhi T, Flanders SA, Markovitz E, Saint S, Kaul DR. Importance of urinary tract infection to antibiotic use among hospitalized patients. Infect Control Hosp Epidemiol 2009; 30:193–195.
- Cope M, Cevallos ME, Cadle RM, Darouiche RO, Musher DM, Trautner BW. Inappropriate treatment of catheter-associated asymptomatic bacteriuria in a tertiary care hospital. Clin Infect Dis 2009; 48:1182–1188.
- Trautner BW, Grigoryan L, Petersen NJ, et al. Effectiveness of an antimicrobial stewardship approach for urinary catheter-associated asymptomatic bacteriuria. JAMA Intern Med 2015; 175:1120–1127.
- Shrestha NK, Bhaskaran A, Scalera NM, Schmitt SK, Rehm SJ, Gordon SM. Antimicrobial stewardship at transition of care from hospital to community. Infect Control Hosp Epidemiol 2012; 33:401–404.
- Meeker D, Knight TK, Friedberg MW, et al. Nudging guideline-concordant antibiotic prescribing: a randomized clinical trial. JAMA Intern Med 2014; 174:425–431.
- Meeker D, Linder JA, Fox CR, et al. Effect of behavioral interventions on inappropriate antibiotic prescribing among primary care practices: a randomized clinical trial. JAMA 2016; 315:562–572.
- Freedberg DE, Salmasian H, Cohen B, Abrams JA, Larson EL. Receipt of antibiotics in hospitalized patients and risk for Clostridium difficile infection in subsequent patients who occupy the same bed. JAMA Intern Med 2016; 176:1801–1808.
- Centers for Disease Control and Prevention. Get smart: know when antibiotics work. Measuring outpatient antibiotic prescribing. https://www.cdc.gov/getsmart/community/programs-measurement/measuring-antibiotic-prescribing.html. Accessed February 5, 2017.
- Ternhag A, Hellman J. More on U.S. outpatient antibiotic prescribing, 2010. N Engl J Med 2013; 369:1175–1176.
- Baggs J, Fridkin SK, Pollack LA, Srinivasan A, Jernigan JA. Estimating national trends in inpatient antibiotic use among US hospitals from 2006 to 2012. JAMA Intern Med 2016; 176:1639–1648.
- Hecker MT, Aron DC, Patel NP, Lehmann MK, Donskey CJ. Unnecessary use of antimicrobials in hospitalized patients: current patterns of misuse with an emphasis on the antianaerobic spectrum of activity. Arch Intern Med 2003; 163:972–978.
- Shehab N, Patel PR, Srinivasan A, Budnitz DS. Emergency department visits for antibiotic-associated adverse events. Clin Infect Dis 2008; 47:735–743.
- Korpela K, de Vos WM. Antibiotic use in childhood alters the gut microbiota and predisposes to overweight. Microb Cell 2016; 3:296–298.
- Gray LE, O’Hely M, Ranganathan S, Sly PD, Vuillermin P. The maternal diet, gut bacteria, and bacterial metabolites during pregnancy influence offspring asthma. Front Immunol 2017; 8:365
- Haak BW, Wiersinga WJ. The role of the gut microbiota in sepsis. Lancet Gastroenterol Hepatol 2017; 2:135–143.
- Baggs J, Jernigan J, Mccormick K, Epstein L, Laufer-Halpin AS, Mcdonald C. Increased risk of sepsis during hospital readmission following exposure to certain antibiotics during hospitalization. Abstract presented at IDWeek, October 26-30, 2016, New Orleans, LA. https://idsa.confex.com/idsa/2016/webprogram/Paper58587.html. Accessed August 8, 2017.
- Hensgens MP, Goorhuis A, Dekkers OM, Kuijper EJ. Time interval of increased risk for Clostridium difficile infection after exposure to antibiotics. J Antimicrob Chemother 2012; 67:742–748.
- Ashiru-Oredope D, Sharland M, Charani E, McNulty C, Cooke J; ARHAI Antimicrobial Stewardship Group. Improving the quality of antibiotic prescribing in the NHS by developing a new antimicrobial stewardship programme: Start Smart—Then Focus. J Antimicrob Chemother 2012; 67(suppl 1):i51–i63.
- Wilcox MH, Shetty N, Fawley WN, et al. Changing epidemiology of Clostridium difficile infection following the introduction of a national ribotyping-based surveillance scheme in England. Clin Infect Dis 2012; 55:1056-1063.
- Public Health England. Clostridium difficile infection: monthly data by NHS acute trust. https://www.gov.uk/government/statistics/clostridium-difficile-infection-monthly-data-by-nhs-acute-trust. Accessed August 4, 2017.
- Singh N, Rogers P, Atwood CW, Wagener MM, Yu VL. Short-course empiric antibiotic therapy for patients with pulmonary infiltrates in the intensive care unit. A proposed solution for indiscriminate antibiotic prescription. Am J Respir Crit Care Med 2000; 162:505–511.
- Fishman N. Antimicrobial stewardship. Am J Med 2006; 119:S53–S61.
KEY POINTS
- Antibiotics are fundamentally different from other medications, posing special challenges for improving their use.
- Antibiotic usage in the United States varies widely among healthcare settings.
- Antibiotic stewardship efforts should focus on optimizing appropriate use rather than simply reducing use.
- Effective interventions include timely consultation on appropriate prescribing, targeting specific infections, and providing feedback to physicians.
Cardiogenic shock: From ECMO to Impella and beyond
A 43-year-old man presented to a community hospital with acute chest pain and shortness of breath and was diagnosed with anterior ST-elevation myocardial infarction. He was a smoker with a history of alcohol abuse, hypertension, and hyperlipidemia, and in the past he had undergone percutaneous coronary interventions to the right coronary artery and the first obtuse marginal artery.
Angiography showed total occlusion in the left anterior descending artery, 90% stenosis in the right coronary artery, and mild disease in the left circumflex artery. A drug-eluting stent was placed in the left anterior descending artery, resulting in good blood flow.
However, his left ventricle continued to have severe dysfunction. An intra-aortic balloon pump was inserted. Afterward, computed tomography showed subsegmental pulmonary embolism with congestion. His mean arterial pressure was 60 mm Hg (normal 70–110), central venous pressure 12 mm Hg (3–8), pulmonary artery pressure 38/26 mm Hg (15–30/4–12), pulmonary capillary wedge pressure 24 mm Hg (2–15), and cardiac index 1.4 L/min/m2 (2.5–4).
The patient was started on dobutamine and norepinephrine and transferred to Cleveland Clinic on day 2. Over the next day, he had runs of ventricular tachycardia, for which he was given amiodarone and lidocaine. His urine output was low, and his serum creatinine was elevated at 1.65 mg/dL (baseline 1.2, normal 0.5–1.5). Liver function tests were also elevated, with aspartate aminotransferase at 115 U/L (14–40) and alanine aminotransferase at 187 U/L (10–54).
Poor oxygenation was evident: his arterial partial pressure of oxygen was 64 mm Hg (normal 75–100). He was intubated and given 100% oxygen with positive end-expiratory pressure of 12 cm H2O.
Echocardiography showed a left ventricular ejection fraction of 15% (normal 55%–70%) and mild right ventricular dysfunction.
ECMO and then Impella placement
On his third hospital day, a venoarterial extracorporeal membrane oxygenation (ECMO) device was placed peripherally (Figure 1).
His hemodynamic variables stabilized, and he was weaned off dobutamine and norepinephrine. Results of liver function tests normalized, his urinary output increased, and his serum creatinine dropped to a normal 1.0 mg/dL. However, a chest radiograph showed pulmonary congestion, and echocardiography now showed severe left ventricular dysfunction.
On hospital day 5, the patient underwent surgical placement of an Impella 5.0 device (Abiomed, Danvers, MA) through the right axillary artery in an effort to improve his pulmonary edema. The ECMO device was removed. Placement of a venovenous ECMO device was deemed unnecessary when oxygenation improved with the Impella.
Three days after Impella placement, radiography showed improved edema with some remaining pleural effusion.
ACUTE CARDIOGENIC SHOCK
Cardiogenic shock remains a challenging clinical problem: patients with it are among the sickest in the hospital, and many of them die. ECMO was once the only therapy available and is still widely used. However, it is a 2-edged sword; complications such as bleeding, infection, and thrombosis are almost inevitable if it is used for long. Importantly, patients are usually kept intubated and bedridden.
In recent years, new devices have become available that are easier to place (some in the catheterization laboratory or even at the bedside) and allow safer bridging to recovery, transplant, or other therapies.
This case illustrates the natural history of cardiogenic shock and the preferred clinical approach: ie, ongoing evaluation that permits rapid response to evolving challenges.
In general, acute cardiogenic shock occurs within 24 to 48 hours after the initial insult, so even if a procedure succeeds, the patient may develop progressive hypotension and organ dysfunction. Reduced cardiac output causes a downward spiral with multiple systemic and inflammatory processes as well as increased nitric oxide synthesis, leading to progressive decline and eventual end-organ dysfunction.
Continuously evaluate
The cardiac team should continuously assess the acuity and severity of a patient’s condition, with the goals of maintaining end-organ perfusion and identifying the source of problems. Refractory cardiogenic shock, with tissue hypoperfusion despite vasoactive medications and treatment of the underlying cause, is associated with in-hospital mortality rates ranging from 30% to 50%.1,2 The rates have actually increased over the past decade, as sicker patients are being treated.
When a patient presents with cardiogenic shock, we first try a series of vasoactive drugs and usually an intra-aortic balloon pump (Figure 2). We then tailor treatment depending on etiology. For example, a patient may have viral myocarditis and may even require a biopsy.
If cardiogenic shock is refractory, mechanical circulatory support devices can be a short-term bridge to either recovery or a new decision. A multidisciplinary team should be consulted to consider transplant, a long-term device, or palliative care. Sometimes a case requires “bridging to a bridge,” with several devices used short-term in turn.
Prognostic factors in cardiogenic shock
Several tools help predict outcome in a severely ill patient. End-organ function, indicated by blood lactate levels and estimated glomerular filtration rate, is perhaps the most informative and should be monitored serially.
CardShock3 is a simple scoring system based on age, mental status at presentation, laboratory values, and medical history. Patients receive 1 point for each of the following factors:
- Age > 75
- Confusion at presentation
- Previous myocardial infarction or coronary artery bypass grafting
- Acute coronary syndrome etiology
- Left ventricular ejection fraction < 40%
- Blood lactate level between 2 and 4 mmol/L, inclusive (2 points for lactate levels > 4 mmol/L)
- Estimated glomerular filtration rate between 30 and 60 mL/min/1.73 m2, inclusive (2 points if < 30 mL/min/1.73 m2).
Thus, scores range from 0 (best) to 9 (worst). A score of 0 to 3 points was associated with a 9% risk of death in the hospital, a score of 4 or 5 with a risk of 36%, and a score of 6 through 9 with a risk of 77%.3
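Because the CardShock score is simple arithmetic, the tally above can be sketched in a few lines of Python. This is an illustrative helper, not a validated clinical tool; the risk bands simply restate the figures quoted from the CardShock study.

```python
def cardshock_score(age, confused, prior_mi_or_cabg, acs_etiology,
                    lvef, lactate, egfr):
    """Compute the CardShock risk score (0 = best, 9 = worst)."""
    score = 0
    score += 1 if age > 75 else 0          # age > 75 years
    score += 1 if confused else 0          # confusion at presentation
    score += 1 if prior_mi_or_cabg else 0  # previous MI or CABG
    score += 1 if acs_etiology else 0      # acute coronary syndrome etiology
    score += 1 if lvef < 40 else 0         # ejection fraction, %
    # Blood lactate, mmol/L: 1 point for 2-4 inclusive, 2 points above 4
    score += 2 if lactate > 4 else (1 if lactate >= 2 else 0)
    # Estimated GFR, mL/min/1.73 m2: 1 point for 30-60 inclusive, 2 below 30
    score += 2 if egfr < 30 else (1 if egfr <= 60 else 0)
    return score

def in_hospital_mortality_band(score):
    """Map a CardShock score to the reported in-hospital mortality risk."""
    if score <= 3:
        return "9%"
    if score <= 5:
        return "36%"
    return "77%"

# The case patient: age 43, no confusion, prior infarction, ACS etiology,
# LVEF 15%, lactate 1.5 mmol/L, eGFR 52
patient_score = cardshock_score(43, False, True, True, 15, 1.5, 52)  # -> 4
risk = in_hospital_mortality_band(patient_score)                     # -> "36%"
```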
The Survival After Veno-arterial ECMO (SAVE) score (www.save-score.com) is a prediction tool derived from a large international ECMO registry.4 It is based on patient age, diagnosis, and indicators of end-organ dysfunction. Scores range from –35 (worst) to +7 (best).
The mortality rate associated with postcardiotomy cardiogenic shock increases with the amount of inotropic support provided. In a 1996–1999 case series of patients who underwent open-heart surgery,5 the hospital mortality rate was 40% in those who received 2 inotropes in high doses and 80% in those who received 3. A strategy of early implementation of mechanical support is critical.
Selection criteria for destination therapy
Deciding whether a patient should receive a long-term device is frequently a challenge. The decision often must be based on limited information about not only the medical indications but also psychosocial factors that influence long-term success.
The Centers for Medicare and Medicaid Services have established criteria for candidates for left ventricular assist devices (LVADs) as destination therapy.6 Contraindications established for heart transplant should also be considered (Table 1).
CASE REVISITED
Several factors argued against LVAD placement in our patient. He had no health insurance and had been off medications. He smoked and said he consumed 3 hard liquor drinks per week. His Stanford Integrated Psychosocial Assessment for Transplantation score was 30 (minimally acceptable). He had hypoxia with subsegmental pulmonary embolism, a strong contraindication to immediate transplant.
On the other hand, he had only mild right ventricular dysfunction. His CardShock score was 4 (intermediate risk, based on lactate 1.5 mmol/L and estimated glomerular filtration rate 52 mL/min/1.73 m2). His SAVE score was –9 (class IV), which overall is associated with a 30% risk of death (low enough to consider treatment).
During the patient’s time on temporary support, the team had the opportunity to better understand him and assess his family support and his ability to handle a permanent device. His surviving the acute course bolstered the team’s confidence that he could enjoy long-term survival with destination therapy.
CATHETERIZATION LABORATORY DEVICE CAPABILITIES
Although most implantation procedures are done in the operating room, some must be done in the catheterization laboratory: patients undergoing catheterization may not be stable enough for transfer, or an emergency intervention may be required during the night. Catheterization interventionists are also an important part of the team that determines the best approach for long-term therapy.
The catheterization laboratory has multiple acute intervention options. Usually, decisions must be made quickly. In general, patients needing mechanical support are managed as follows:
- Those who need circulation support and oxygenation receive ECMO
- Those who need circulation support alone because of mechanical issues (eg, myocardial infarction) are considered for an intra-aortic balloon pump, Impella, or TandemHeart pump (Cardiac Assist, Pittsburgh, PA).
Factors that guide the selection of a temporary pump include:
- Left ventricular function
- Right ventricular function
- Aortic valve stenosis (some devices cannot be inserted through critical aortic stenosis)
- Aortic regurgitation (can affect some devices)
- Peripheral artery disease (some devices are large and must be placed percutaneously).
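The initial branching above can be expressed as a rough sketch. This is purely illustrative, not a clinical decision rule: the boolean inputs are simplifications of a multidisciplinary assessment, and the valve- and vessel-related exclusions are assumptions drawn from the device descriptions elsewhere in this article (the Impella crosses the aortic valve; the TandemHeart requires a large-bore femoral cannula) plus the well-known contraindication to balloon counterpulsation in significant aortic regurgitation.

```python
def temporary_support_candidates(needs_oxygenation,
                                 critical_aortic_stenosis,
                                 significant_aortic_regurgitation,
                                 peripheral_artery_disease):
    """Illustrative triage sketch; real selection is a team decision
    that also weighs left and right ventricular function."""
    if needs_oxygenation:
        # Circulation support plus oxygenation: ECMO is the option
        return ["venoarterial ECMO"]
    # Circulation support alone: narrow the list by patient anatomy
    options = ["intra-aortic balloon pump", "Impella", "TandemHeart"]
    if critical_aortic_stenosis:
        # The Impella sits retrograde across the aortic valve and may
        # not be insertable through a critically stenotic valve
        options.remove("Impella")
    if significant_aortic_regurgitation:
        # Balloon counterpulsation can worsen regurgitation
        options.remove("intra-aortic balloon pump")
    if peripheral_artery_disease:
        # Large-bore femoral cannulas may not fit diseased vessels
        options.remove("TandemHeart")
    return options or ["consider surgical options"]
```

For example, a patient with adequate oxygenation but critical aortic stenosis would be left with the balloon pump and the TandemHeart as percutaneous candidates.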
CHOOSING AMONG PERCUTANEOUS DEVICES
Circulatory support in cardiogenic shock improves outcomes, and devices play an important role in supporting high-risk procedures. The goal is not necessarily to use the device throughout the hospital stay. Acute stabilization is most important initially; a more considered decision about long-term therapy can be made when more is known about the patient.
Patient selection is the most important component of success. However, randomized data to support outcomes with the various devices are sparse and complicated by the critically ill state of the patient population.
SHORT-TERM CIRCULATORY SUPPORT: ECMO, IMPELLA, TANDEMHEART
A menu of options is available for temporary mechanical support. Options differ by their degree of circulatory support and ease of insertion (Table 2).
ECMO: A fast option with many advantages
ECMO has evolved and now can be placed quickly. A remote diagnostic platform such as CardioHub permits management at the bedside, in the medical unit, or in the cardiac intensive care unit.7
ECMO has several advantages. It can be used during cardiopulmonary bypass, it provides oxygenation, it is the only option in the setting of lung injury, it can be placed peripherally (without thoracotomy), and it is the only percutaneous option for biventricular support.
ECMO also has significant disadvantages
ECMO is a good device for acute resuscitation of a patient in shock, as it offers quick placement and resuscitation. But it is falling out of favor because of significant disadvantages.
Its major drawback is that it provides no left ventricular unloading. Although in a very unstable patient ECMO can stabilize end organs and restore their function, the failure to unload the left ventricle and reduce ventricular work threatens the myocardium. ECMO creates extremely high afterload; in a left ventricle with poor function, wall tension and myocardial oxygen demand therefore increase. Multiple studies have shown that coronary perfusion worsens, especially if the patient is cannulated peripherally. Because relative cerebral hypoxia occurs in many situations, it is imperative to check blood saturations at multiple sites to determine whether perfusion is adequate everywhere.
Ineffective left ventricular unloading with venoarterial ECMO is managed in several ways. Sometimes left ventricular distention is slight and the effects are subtle. Left ventricular distention causing pulmonary edema can be addressed with:
- Inotropes (in moderate doses)
- Anticoagulation to prevent left ventricular thrombus formation
- An intra-aortic balloon pump. Most patients on ECMO already have an intra-aortic balloon pump in place, and it should be left in to provide additional support. For those who do not have one, it should be placed via the contralateral femoral artery.
If problems persist despite these measures, apical cannulation or left ventricular septostomy can be performed.
Outcomes with ECMO have been disappointing. Studies show that whether ECMO is indicated for cardiac failure or for respiratory failure, survival is only about 25% at 5 years. In data limited to venoarterial ECMO, survival was 48% in bridged patients and 41% in patients who were weaned.
The Extracorporeal Life Support Organization Registry, in its 2010 international summary, found that 34% of cardiac patients on ECMO survived to discharge or transfer. Most of these patients had cardiogenic shock from acute myocardial infarction. Outcomes are poor because of complications endemic to ECMO, eg, dialysis-dependent renal failure (about 40%) and neurologic complications (about 30%), often involving ischemic or hemorrhagic stroke.
Limb and pump complications were also significant in the past. These have been reduced with the new reperfusion cannula and the Quadrox oxygenator.
Complications unique to ECMO should be understood and anticipated so that they can be avoided. Better tools are available, ie, Impella and TandemHeart.
Left-sided Impella: A longer-term temporary support
ECMO is a temporary fix that is usually used only for a few days. If longer support is needed, axillary placement of an Impella should be used as a bridge to recovery, transplant, or a durable LVAD.
The Impella device (Figure 3) is a miniature rotary blood pump increasingly used to treat cardiogenic shock. It is inserted retrograde across the aortic valve to provide short-term ventricular support. Most devices are approved by the US Food and Drug Administration (FDA) for less than 7 days of use, but we have experience using them up to 30 days. They are very hemocompatible, involving minimal hemolysis. Axillary placement allows early extubation and ambulation and is more stable than groin placement.
Several models are available: the 2.5 and 3.5 L/min devices can be placed percutaneously, while the 5 L/min model must be surgically placed in the axillary or groin region. Heparin is required with their use. They can replace ECMO. A right ventricular assist device (RVAD), Impella RP, is also available.
Physiologic impact of the Impella
The Impella fully unloads the left ventricle, reducing myocardial oxygen demand and increasing myocardial blood flow. It reduces end-diastolic volume and pressure, the mechanical work of the heart, and wall tension. Microvascular resistance is reduced, allowing increased coronary flow. Cardiac output and power are increased by multiple means.8–11
The RECOVER I trial evaluated the Impella 5.0 placed after cardiac surgery. The cardiac index increased in all the patients, and systemic vascular resistance and wedge pressure decreased.12
Unloading the ventricle is critical. Meyns and colleagues13 found a fivefold reduction in infarct size from baseline in a left anterior descending occlusion model in pigs after off-loading the ventricle.
Impella has the advantage of simple percutaneous insertion (the 2.5 and CP models). It also tests right ventricular tolerance: if the right ventricle is doing well, one can predict with high certainty that it will tolerate an LVAD (eg, HeartWare, HeartMate II [Pleasanton, CA], or HeartMate 3 when available).
Disadvantages include that it provides only left ventricular support, although a right ventricular device can be inserted for dual support. Placement requires fluoroscopic or echocardiographic guidance.
TandemHeart requires septal puncture
The TandemHeart is approved for short-term and biventricular use. It consists of an extracorporeal centrifugal pump that withdraws blood from the left atrium via a trans-septal cannula placed through the femoral vein (Figure 4) and returns it to one or both femoral arteries. The blood is pumped at up to 5 L/min.
It is designed to reduce the pulmonary capillary wedge pressure, ventricular work, and myocardial oxygen demand and increase cardiac output and mean arterial pressure. It has the advantages of percutaneous placement and the ability to provide biventricular support with 2 devices. It can be used for up to 3 weeks. It can easily be converted to ECMO by either splicing in an oxygenator or adding another cannula.
Although the TandemHeart provides significant support, it is no longer often used. A 21F venous cannula must be passed to the left atrium by trans-septal puncture, which requires advanced skill and must be done in the catheterization laboratory. Insertion can take too much time and cause bleeding in patients taking an anticoagulant. Insertion usually destroys the septum, and removal requires a complete patch of the entire septum. Systemic anticoagulation is required. Other disadvantages are risks of hemolysis, limb ischemia, and infection with longer support times.
The CentriMag (Levitronix LLC; Framingham, MA) is an improved device that requires only 1 cannula instead of 2 to cover both areas.
DEVICES FOR RIGHT-SIDED SUPPORT
Most early devices were designed for left-sided support. The right heart, especially in failure, has been more difficult to manage. Previously the only option for a patient with right ventricular failure was venoarterial ECMO. This is more support than needed for a patient with isolated right ventricular failure and involves the risk of multiple complications from the device.
With more options available for the right heart (Table 3), we can choose the most appropriate device according to the underlying cause of right heart failure (eg, right ventricular infarct, pulmonary hypertension), the likelihood of recovery, and the expected time to recovery.
The ideal RVAD would be easy to implant, maintain, and remove. It would allow for chest closure and patient ambulation. It would be durable and biocompatible, so that it could remain implanted for months if necessary. It would cause little blood trauma, have the capability for adding an oxygenator for pulmonary support, and be cost-effective.
Although no single system has all these qualities, each available device fulfills certain combinations of these criteria, so the best one can be selected for each patient’s needs.
ECMO Rotaflow centrifugal pump: Fast, simple, inexpensive
A recent improvement to ECMO is the Rotaflow centrifugal pump (Maquet, Wayne, NJ), which is connected by sewing an 8-mm graft onto the pulmonary artery and placing a venous cannula in the femoral vein. If the patient is not bleeding, the chest can then be closed. This creates a fast, simple, and inexpensive temporary RVAD system. When the patient is ready to be weaned, the outflow graft can be disconnected at the bedside without reopening the chest.
The disadvantage is that the Rotaflow system contains a sapphire bearing. Although it is magnetically coupled, it generates heat and is a nidus for thrombus formation, which can lead to pump failure and embolization. This system can be used for patients who are expected to need support for less than 5 to 7 days. Beyond this duration, the incidence of complications increases.
CentriMag Ventricular Assist System offers right, left, or bilateral support
The CentriMag Ventricular Assist System is a fully magnetically levitated pump containing no bearings or seals, using the same technology found in many durable devices such as the HeartMate 3. It is coupled with a reusable motor and is easy to use.
CentriMag offers versatility, allowing for right, left, or bilateral ventricular support. An oxygenator can be added for pulmonary edema and additional support. It is the most biocompatible device and is FDA-approved for use for 4 weeks, although it has been used successfully for much longer. It allows for chest closure and ambulation. It is especially important as a bridge to transplant. The main disadvantage is that insertion and removal require sternotomy.
Impella RP: One size does not fit all
The Impella RP (Figure 5) has an 11F catheter diameter, a 23F pump, and a maximum flow rate of more than 4 L/min. It has a unique 3-dimensional cannula design based on 3-dimensional computed tomography reconstructions from hundreds of patients.
The device is biocompatible and can be used for support for more than 7 days, although most patients require only 3 or 4 days. There is almost no priming volume, so there is no hemodilution.
The disadvantages are that it is more challenging to place than other devices, and some patients cannot use it because the cannula does not fit. It also does not provide pulmonary support. Finally, it is the most expensive of the 3 right-sided devices.
CASE REVISITED
The patient described at the beginning of this article was extubated on day 12 but was then reintubated. On day 20, a tracheotomy tube was placed. By day 24, he had improved so little that his family signed a “do-not-resuscitate–comfort-care-arrest” order (ie, if the patient’s heart or breathing stops, only comfort care is to be provided).
But slowly he got better, and the Impella was removed on day 30. Afterward, serum creatinine and liver function tests began rising again, requiring dobutamine for heart support.
On day 34, his family reversed the do-not-resuscitate order, and he was reevaluated for an LVAD as destination therapy. At this point, echocardiography showed a left ventricular ejection fraction of 10%, normal right ventricular function, with a normal heartbeat and valves. On day 47, a HeartMate II LVAD was placed.
On postoperative day 18, he was transferred out of the intensive care unit, then discharged to an acute rehabilitation facility 8 days later (hospital day 73). He was subsequently discharged.
At a recent follow-up appointment, the patient said that he was feeling “pretty good” and walked with no shortness of breath.
REFERENCES
1. Reyentovich A, Barghash MH, Hochman JS. Management of refractory cardiogenic shock. Nat Rev Cardiol 2016; 13:481–492.
2. Wayangankar SA, Bangalore S, McCoy LA, et al. Temporal trends and outcomes of patients undergoing percutaneous coronary interventions for cardiogenic shock in the setting of acute myocardial infarction: a report from the CathPCI registry. JACC Cardiovasc Interv 2016; 9:341–351.
3. Harjola VP, Lassus J, Sionis A, et al; CardShock Study Investigators; GREAT network. Clinical picture and risk prediction of short-term mortality in cardiogenic shock. Eur J Heart Fail 2015; 17:501–509.
4. Schmidt M, Burrell A, Roberts L, et al. Predicting survival after ECMO for refractory cardiogenic shock: the survival after veno-arterial-ECMO (SAVE)-score. Eur Heart J 2015; 36:2246–2256.
5. Samuels LE, Kaufman MS, Thomas MP, Holmes EC, Brockman SK, Wechsler AS. Pharmacological criteria for ventricular assist device insertion following postcardiotomy shock: experience with the Abiomed BVS system. J Card Surg 1999; 14:288–293.
6. Centers for Medicare & Medicaid Services. Decision memo for ventricular assist devices as destination therapy (CAG-00119R2). www.cms.gov/medicare-coverage-database/details/nca-decision-memo.aspx?NCAId=243&ver=9&NcaName=Ventricular+Assist+Devices+as+Destination+Therapy+(2nd+Recon)&bc=BEAAAAAAEAAA&&fromdb=true. Accessed March 10, 2017.
7. Kulkarni T, Sharma NS, Diaz-Guzman E. Extracorporeal membrane oxygenation in adults: a practical guide for internists. Cleve Clin J Med 2016; 83:373–384.
8. Remmelink M, Sjauw KD, Henriques JP, et al. Effects of left ventricular unloading by Impella Recover LP2.5 on coronary hemodynamics. Catheter Cardiovasc Interv 2007; 70:532–537.
9. Aqel RA, Hage FG, Iskandrian AE. Improvement of myocardial perfusion with a percutaneously inserted left ventricular assist device. J Nucl Cardiol 2010; 17:158–160.
10. Sarnoff SJ, Braunwald E, Welch GH Jr, Case RB, Stainsby WN, Macruz R. Hemodynamic determinants of oxygen consumption of the heart with special reference to the tension-time index. Am J Physiol 1957; 192:148–156.
11. Braunwald E. 50th anniversary historical article. Myocardial oxygen consumption: the quest for its determinants and some clinical fallout. J Am Coll Cardiol 1999; 34:1365–1368.
12. Griffith BP, Anderson MB, Samuels LE, Pae WE Jr, Naka Y, Frazier OH. The RECOVER I: a multicenter prospective study of Impella 5.0/LD for postcardiotomy circulatory support. J Thorac Cardiovasc Surg 2013; 145:548–554.
13. Meyns B, Stolinski J, Leunens V, Verbeken E, Flameng W. Left ventricular support by catheter-mounted axial flow pump reduces infarct size. J Am Coll Cardiol 2003; 41:1087–1095.
A 43-year-old man presented to a community hospital with acute chest pain and shortness of breath and was diagnosed with anterior ST-elevation myocardial infarction. He was a smoker with a history of alcohol abuse, hypertension, and hyperlipidemia, and in the past he had undergone percutaneous coronary interventions to the right coronary artery and the first obtuse marginal artery.
Angiography showed total occlusion in the left anterior descending artery, 90% stenosis in the right coronary artery, and mild disease in the left circumflex artery. A drug-eluting stent was placed in the left anterior descending artery, resulting in good blood flow.
However, his left ventricle continued to have severe dysfunction. An intra-aortic balloon pump was inserted. Afterward, computed tomography showed subsegmental pulmonary embolism with congestion. His mean arterial pressure was 60 mm Hg (normal 70–110), central venous pressure 12 mm Hg (3–8), pulmonary artery pressure 38/26 mm Hg (15–30/4–12), pulmonary capillary wedge pressure 24 mm Hg (2–15), and cardiac index 1.4 L/min/m2 (2.5–4).
The patient was started on dobutamine and norepinephrine and transferred to Cleveland Clinic on day 2. Over the next day, he had runs of ventricular tachycardia, for which he was given amiodarone and lidocaine. His urine output was low, and his serum creatinine was elevated at 1.65 mg/dL (baseline 1.2, normal 0.5–1.5). Liver function tests were also elevated, with aspartate aminotransferase at 115 U/L (14–40) and alanine aminotransferase at 187 U/L (10–54).
Poor oxygenation was evident: his arterial partial pressure of oxygen was 64 mm Hg (normal 75–100). He was intubated and given 100% oxygen with positive end-expiratory pressure of 12 cm H2O.
Echocardiography showed a left ventricular ejection fraction of 15% (normal 55%–70%) and mild right ventricular dysfunction.
ECMO and then Impella placement
On his third hospital day, a venoarterial extracorporeal membrane oxygenation (ECMO) device was placed peripherally (Figure 1).
His hemodynamic variables stabilized, and he was weaned off dobutamine and norepinephrine. Results of liver function tests normalized, his urinary output increased, and his serum creatinine dropped to a normal 1.0 mg/dL. However, a chest radiograph showed pulmonary congestion, and echocardiography now showed severe left ventricular dysfunction.
On hospital day 5, the patient underwent surgical placement of an Impella 5.0 device (Abiomed, Danvers, MA) through the right axillary artery in an effort to improve his pulmonary edema. The ECMO device was removed. Placement of a venovenous ECMO device was deemed unnecessary when oxygenation improved with the Impella.
Three days after Impella placement, radiography showed improvement in the pulmonary edema, with some residual pleural effusion.
ACUTE CARDIOGENIC SHOCK
Cardiogenic shock remains a challenging clinical problem: patients with it are among the sickest in the hospital, and many of them die. ECMO was once the only therapy available and is still widely used. However, it is a 2-edged sword; complications such as bleeding, infection, and thrombosis are almost inevitable if it is used for long. Importantly, patients are usually kept intubated and bedridden.
In recent years, new devices have become available that are easier to place (some in the catheterization laboratory or even at the bedside) and allow safer bridging to recovery, transplant, or other therapies.
This case illustrates the natural history of cardiogenic shock and the preferred clinical approach: ie, ongoing evaluation that permits rapid response to evolving challenges.
In general, acute cardiogenic shock occurs within 24 to 48 hours after the initial insult, so even if a procedure succeeds, the patient may develop progressive hypotension and organ dysfunction. Reduced cardiac output causes a downward spiral with multiple systemic and inflammatory processes as well as increased nitric oxide synthesis, leading to progressive decline and eventual end-organ dysfunction.
Continuously evaluate
The cardiac team should continuously assess the acuity and severity of a patient’s condition, with the goals of maintaining end-organ perfusion and identifying the source of problems. Refractory cardiogenic shock, with tissue hypoperfusion despite vasoactive medications and treatment of the underlying cause, is associated with in-hospital mortality rates ranging from 30% to 50%.1,2 The rates have actually increased over the past decade, as sicker patients are being treated.
When a patient presents with cardiogenic shock, we first try a series of vasoactive drugs and usually an intra-aortic balloon pump (Figure 2). We then tailor treatment depending on etiology. For example, a patient may have viral myocarditis and may even require a biopsy.
If cardiogenic shock is refractory, mechanical circulatory support devices can be a short-term bridge to either recovery or a new decision. A multidisciplinary team should be consulted to consider transplant, a long-term device, or palliative care. Sometimes a case requires “bridging to a bridge,” with several devices used short-term in turn.
Prognostic factors in cardiogenic shock
Several tools help predict outcome in a severely ill patient. End-organ function, indicated by blood lactate levels and estimated glomerular filtration rate, is perhaps the most informative and should be monitored serially.
CardShock3 is a simple scoring system based on age, mental status at presentation, laboratory values, and medical history. Patients receive 1 point for each of the following factors:
- Age > 75
- Confusion at presentation
- Previous myocardial infarction or coronary artery bypass grafting
- Acute coronary syndrome etiology
- Left ventricular ejection fraction < 40%
- Blood lactate level of 2 to 4 mmol/L, inclusive (2 points for lactate levels > 4 mmol/L)
- Estimated glomerular filtration rate of 30 to 60 mL/min/1.73 m2, inclusive (2 points if < 30 mL/min/1.73 m2).
Thus, scores range from 0 (best) to 9 (worst). A score of 0 to 3 points was associated with a 9% risk of death in the hospital, a score of 4 or 5 with a risk of 36%, and a score of 6 through 9 with a risk of 77%.3
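The point assignments above can be sketched in code. The following is an illustrative implementation only; the function and parameter names are my own, while the thresholds and mortality bands follow the text.

```python
# Illustrative calculator for the CardShock risk score described above.

def cardshock_score(age, confused, prior_mi_or_cabg, acs_etiology,
                    lvef, lactate, egfr):
    """Return the CardShock score: 0 (best) to 9 (worst).

    lvef in percent, lactate in mmol/L, egfr in mL/min/1.73 m2.
    """
    score = 0
    if age > 75:
        score += 1
    if confused:
        score += 1
    if prior_mi_or_cabg:
        score += 1
    if acs_etiology:
        score += 1
    if lvef < 40:
        score += 1
    # Lactate 2-4 mmol/L scores 1 point; above 4 mmol/L scores 2
    if lactate > 4:
        score += 2
    elif lactate >= 2:
        score += 1
    # eGFR 30-60 scores 1 point; below 30 scores 2
    if egfr < 30:
        score += 2
    elif egfr <= 60:
        score += 1
    return score

def in_hospital_mortality(score):
    """Map a score to the in-hospital mortality band reported in the study."""
    if score <= 3:
        return "9%"
    if score <= 5:
        return "36%"
    return "77%"
```

For the patient in this case (acute coronary syndrome etiology, ejection fraction 15%, lactate 1.5 mmol/L, eGFR 52 mL/min/1.73 m2), counting his coronary history as a previous infarction reproduces his reported score of 4 and the intermediate (36%) mortality band.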
The Survival After Veno-arterial ECMO (SAVE) score (www.save-score.com) is a prediction tool derived from a large international ECMO registry.4 It is based on patient age, diagnosis, and indicators of end-organ dysfunction. Scores range from –35 (worst) to +7 (best).
The mortality rate associated with postcardiotomy cardiogenic shock increases with the amount of inotropic support provided. In a 1996–1999 case series of patients who underwent open-heart surgery,5 the hospital mortality rate was 40% in those who received 2 inotropes in high doses and 80% in those who received 3. A strategy of early implementation of mechanical support is critical.
Selection criteria for destination therapy
Deciding whether a patient should receive a long-term device is frequently a challenge. The decision often must be based on limited information about not only the medical indications but also psychosocial factors that influence long-term success.
The Centers for Medicare and Medicaid Services have established criteria for candidates for left ventricular assist devices (LVADs) as destination therapy.6 Contraindications established for heart transplant should also be considered (Table 1).
CASE REVISITED
Several factors argued against LVAD placement in our patient. He had no health insurance and had been off medications. He smoked and said he consumed 3 hard liquor drinks per week. His Stanford Integrated Psychosocial Assessment for Transplantation score was 30 (minimally acceptable). He had hypoxia with subsegmental pulmonary embolism, a strong contraindication to immediate transplant.
On the other hand, he had only mild right ventricular dysfunction. His CardShock score was 4 (intermediate risk, based on lactate 1.5 mmol/L and estimated glomerular filtration rate 52 mL/min/1.73 m2). His SAVE score was –9 (risk class IV), associated with a survival rate of about 30% (high enough to justify aggressive treatment).
During the patient’s time on temporary support, the team had the opportunity to better understand him and assess his family support and his ability to handle a permanent device. His surviving the acute course bolstered the team’s confidence that he could enjoy long-term survival with destination therapy.
CATHETERIZATION LABORATORY DEVICE CAPABILITIES
Although most implantation procedures are done in the operating room, some must be done in the catheterization laboratory: patients undergoing catheterization may not be stable enough for transfer, or an emergency intervention may be required during the night. Catheterization interventionists are also an important part of the team in determining the best approach for long-term therapy.
The catheterization laboratory has multiple acute intervention options. Usually, decisions must be made quickly. In general, patients needing mechanical support are managed as follows:
- Those who need circulatory support and oxygenation receive ECMO
- Those who need circulatory support alone because of mechanical issues (eg, after myocardial infarction) are considered for an intra-aortic balloon pump, Impella, or TandemHeart pump (Cardiac Assist, Pittsburgh, PA).
Factors that guide the selection of a temporary pump include:
- Left ventricular function
- Right ventricular function
- Aortic valve stenosis (some devices cannot be inserted through critical aortic stenosis)
- Aortic regurgitation (can affect some devices)
- Peripheral artery disease (some devices require large-bore percutaneous arterial cannulas, which diseased vessels may not accommodate).
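The triage described above can be sketched as a small decision function. This is a hypothetical simplification for orientation only; the flags and function name are mine, and real decisions weigh many more factors.

```python
# Hypothetical sketch of the temporary-support triage outlined above.
# Not a clinical algorithm.

def suggest_temporary_support(needs_oxygenation,
                              biventricular_failure,
                              critical_aortic_stenosis):
    """Suggest a class of temporary mechanical support."""
    if needs_oxygenation or biventricular_failure:
        # Per the text, ECMO is the only percutaneous option that
        # also oxygenates, and the only one for biventricular support.
        return "ECMO"
    if critical_aortic_stenosis:
        # A trans-aortic pump (Impella) cannot be inserted across
        # critical aortic stenosis.
        return "intra-aortic balloon pump or TandemHeart"
    return "intra-aortic balloon pump, Impella, or TandemHeart"
```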
CHOOSING AMONG PERCUTANEOUS DEVICES
Circulatory support in cardiogenic shock improves outcomes, and devices play an important role in supporting high-risk procedures. The goal is not necessarily to use the device throughout the hospital stay. Acute stabilization is most important initially; a more considered decision about long-term therapy can be made when more is known about the patient.
Patient selection is the most important component of success. However, randomized data to support outcomes with the various devices are sparse and complicated by the critically ill state of the patient population.
SHORT-TERM CIRCULATORY SUPPORT: ECMO, IMPELLA, TANDEMHEART
A menu of options is available for temporary mechanical support. Options differ by their degree of circulatory support and ease of insertion (Table 2).
ECMO: A fast option with many advantages
ECMO has evolved and now can be placed quickly. A remote diagnostic platform such as CardioHub permits management at the bedside, in the medical unit, or in the cardiac intensive care unit.7
ECMO has several advantages. It can be used during cardiopulmonary bypass, it provides oxygenation, it is the only option in the setting of lung injury, it can be placed peripherally (without thoracotomy), and it is the only percutaneous option for biventricular support.
ECMO also has significant disadvantages
ECMO is a good device for acute resuscitation of a patient in shock, as it offers quick placement and resuscitation. But it is falling out of favor because of significant disadvantages.
Its major drawback is that it provides no left ventricular unloading. In a very unstable patient, ECMO can stabilize end organs and restore their function, but because it neither unloads the left ventricle nor reduces ventricular work, it puts the myocardium at risk. It creates extremely high afterload; in a poorly functioning left ventricle, wall tension and myocardial oxygen demand therefore increase. Multiple studies have shown that coronary perfusion worsens, especially with peripheral cannulation. Because relative cerebral hypoxia occurs in many situations, it is imperative to check blood oxygen saturation at multiple sites to determine whether perfusion is adequate everywhere.
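The link between afterload and oxygen demand can be made explicit with the law of Laplace for wall stress, a standard physiologic approximation (not given in the original discussion):

```latex
% Thin-walled law of Laplace for ventricular wall stress:
%   sigma = wall stress, P = intracavitary pressure,
%   r = cavity radius, h = wall thickness
\[
  \sigma = \frac{P\,r}{2h}
\]
```

Venoarterial ECMO raises the aortic pressure against which the ventricle must eject (higher P), and a poorly unloaded ventricle dilates (higher r); both terms increase wall stress and hence, by the tension-time concept,10,11 myocardial oxygen consumption.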
Ineffective left ventricular unloading with venoarterial ECMO is managed in several ways. Sometimes left ventricular distention is slight and the effects are subtle. Left ventricular distention causing pulmonary edema can be addressed with:
- Inotropes (in moderate doses)
- Anticoagulation to prevent left ventricular thrombus formation
- An intra-aortic balloon pump. Most patients on ECMO already have an intra-aortic balloon pump in place, and it should be left in to provide additional support. For those who do not have one, it should be placed via the contralateral femoral artery.
If problems persist despite these measures, apical cannulation or atrial septostomy can be performed to vent the left ventricle.
Outcomes with ECMO have been disappointing. Studies show that whether ECMO is indicated for cardiac failure or for respiratory failure, survival is only about 25% at 5 years. In data for venoarterial ECMO only, survival was 48% in bridged patients and 41% in patients who were weaned.
The Extracorporeal Life Support Organization Registry, in its international summary from 2010, found that 34% of cardiac patients on ECMO survived to discharge or transfer. Most of these patients had cardiogenic shock from acute myocardial infarction. Outcomes are so poor because of complications endemic to ECMO, eg, dialysis-dependent renal failure (about 40%) and neurologic complications (about 30%), often involving ischemic or hemorrhagic stroke.
Limb and pump complications were also significant in the past. These have been reduced with the new reperfusion cannula and the Quadrox oxygenator.
Complications unique to ECMO should be understood and anticipated so that they can be avoided. Better tools are available, ie, Impella and TandemHeart.
Left-sided Impella: A longer-term temporary support
ECMO is a temporary fix that is usually used only for a few days. If longer support is needed, axillary placement of an Impella should be used as a bridge to recovery, transplant, or a durable LVAD.
The Impella device (Figure 3) is a miniature rotary blood pump increasingly used to treat cardiogenic shock. It is inserted retrograde across the aortic valve to provide short-term ventricular support. Most devices are approved by the US Food and Drug Administration (FDA) for less than 7 days of use, but we have experience using them up to 30 days. They are very hemocompatible, involving minimal hemolysis. Axillary placement allows early extubation and ambulation and is more stable than groin placement.
Several models are available: the 2.5 L/min and 3.5 L/min (CP) devices can be placed percutaneously, while the 5 L/min model must be surgically placed in the axillary or groin region. Heparin is required with their use. They can replace ECMO. A right ventricular assist device (RVAD), the Impella RP, is also available.
Physiologic impact of the Impella
The Impella fully unloads the left ventricle, reducing myocardial oxygen demand and increasing myocardial blood flow. It reduces end-diastolic volume and pressure, the mechanical work of the heart, and wall tension. Microvascular resistance is reduced, allowing increased coronary flow. Cardiac output and power are increased by multiple means.8–11
The RECOVER 1 trial evaluated the 5L Impella placed after cardiac surgery. The cardiac index increased in all the patients, and the systemic vascular resistance and wedge pressure decreased.12
Unloading the ventricle is critical. Meyns and colleagues13 found a fivefold reduction in infarct size from baseline in a left anterior descending occlusion model in pigs after off-loading the ventricle.
Impella has the advantage of simple percutaneous insertion (the 2.5 and CP models). It also tests right ventricular tolerance: if the right ventricle is doing well, one can predict with high certainty that it will tolerate an LVAD (eg, HeartWare, HeartMate II (Pleasanton, CA), or HeartMate 3 when available).
Disadvantages include that it provides only left ventricular support, although a right ventricular device can be inserted for dual support. Placement requires fluoroscopic or echocardiographic guidance.
TandemHeart requires septal puncture
The TandemHeart is approved for short-term and biventricular use. It consists of an extracorporeal centrifugal pump that withdraws blood from the left atrium via a trans-septal cannula placed through the femoral vein (Figure 4) and returns it to one or both femoral arteries. The blood is pumped at up to 5 L/min.
It is designed to reduce the pulmonary capillary wedge pressure, ventricular work, and myocardial oxygen demand and increase cardiac output and mean arterial pressure. It has the advantages of percutaneous placement and the ability to provide biventricular support with 2 devices. It can be used for up to 3 weeks. It can easily be converted to ECMO by either splicing in an oxygenator or adding another cannula.
Although the TandemHeart provides significant support, it is now seldom used. A 21F venous cannula must be passed into the left atrium by trans-septal puncture, which requires advanced skill and must be done in the catheterization laboratory. Insertion takes time and can cause bleeding in patients taking an anticoagulant. Insertion usually damages the septum, and removal requires patch repair of the entire septum. Systemic anticoagulation is required. Other disadvantages are the risks of hemolysis, limb ischemia, and infection with longer support times.
The CentriMag (Levitronix, Framingham, MA) is an improved device that requires only 1 cannula instead of 2 to cover both areas.
DEVICES FOR RIGHT-SIDED SUPPORT
Most early devices were designed for left-sided support. The right heart, especially in failure, has been more difficult to manage. Previously the only option for a patient with right ventricular failure was venoarterial ECMO. This is more support than needed for a patient with isolated right ventricular failure and involves the risk of multiple complications from the device.
With more options available for the right heart (Table 3), we can choose the most appropriate device according to the underlying cause of right heart failure (eg, right ventricular infarct, pulmonary hypertension), the likelihood of recovery, and the expected time to recovery.
The ideal RVAD would be easy to implant, maintain, and remove. It would allow for chest closure and patient ambulation. It would be durable and biocompatible, so that it could remain implanted for months if necessary. It would cause little blood trauma, have the capability for adding an oxygenator for pulmonary support, and be cost-effective.
Although no single system has all these qualities, each available device fulfills certain combinations of these criteria, so the best one can be selected for each patient’s needs.
ECMO Rotaflow centrifugal pump: Fast, simple, inexpensive
A recent improvement to ECMO is the Rotaflow centrifugal pump (Maquet, Wayne, NJ), which is connected by sewing an 8-mm graft onto the pulmonary artery and placing a venous cannula in the femoral vein. If the patient is not bleeding, the chest can then be closed. This creates a fast, simple, and inexpensive temporary RVAD system. When the patient is ready to be weaned, the outflow graft can be disconnected at the bedside without reopening the chest.
The disadvantage is that the Rotaflow system contains a sapphire bearing. Although it is magnetically coupled, it generates heat and is a nidus for thrombus formation, which can lead to pump failure and embolization. This system can be used for patients who are expected to need support for less than 5 to 7 days. Beyond this duration, the incidence of complications increases.
CentriMag Ventricular Assist System offers right, left, or bilateral support
The CentriMag Ventricular Assist System is a fully magnetically levitated pump containing no bearings or seals, and with the same technology as is found in many of the durable devices such as HeartMate 3. It is coupled with a reusable motor and is easy to use.
CentriMag offers versatility, allowing for right, left, or bilateral ventricular support. An oxygenator can be added for pulmonary edema and additional support. It is the most biocompatible device and is FDA-approved for use for 4 weeks, although it has been used successfully for much longer. It allows for chest closure and ambulation. It is especially important as a bridge to transplant. The main disadvantage is that insertion and removal require sternotomy.
Impella RP: One size does not fit all
The Impella RP (Figure 5) has an 11F catheter diameter, 23F pump, and a maximum flow rate of more than 4 L/minute. It has a unique 3-dimensional cannula design based on computed tomography 3-dimensional reconstructions from hundreds of patients.
The device is biocompatible and can be used for support for more than 7 days, although most patients require only 3 or 4 days. There is almost no priming volume, so there is no hemodilution.
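The trade-offs among the 3 right-sided options described above can be restated as a small lookup table. The device names are real; the attributes paraphrase this article and are a reading aid, not clinical guidance.

```python
# Trade-offs among the 3 temporary right-sided support devices,
# paraphrased from the discussion above (illustrative only).

RIGHT_SIDED_DEVICES = {
    "Rotaflow": {
        "insertion": "8-mm graft sewn onto the pulmonary artery",
        "duration": "best for less than 5 to 7 days",
        "drawback": "sapphire bearing generates heat; thrombus risk",
    },
    "CentriMag": {
        "insertion": "sternotomy",
        "duration": "FDA-approved for 4 weeks; used longer",
        "drawback": "insertion and removal require sternotomy",
    },
    "Impella RP": {
        "insertion": "percutaneous (11F catheter, 23F pump)",
        "duration": "usually 3 to 4 days; supported beyond 7",
        "drawback": "cannula fit, no pulmonary support, highest cost",
    },
}
```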
The disadvantages are that it is more challenging to place than other devices, and some patients cannot use it because the cannula does not fit. It also does not provide pulmonary support. Finally, it is the most expensive of the 3 right-sided devices.
CASE REVISITED
The patient described at the beginning of this article was extubated on day 12 but was then reintubated. On day 20, a tracheotomy tube was placed. By day 24, he had improved so little that his family signed a “do-not-resuscitate–comfort-care-arrest” order (ie, if the patient’s heart or breathing stops, only comfort care is to be provided).
But slowly he got better, and the Impella was removed on day 30. Afterward, serum creatinine and liver function tests began rising again, requiring dobutamine for heart support.
On day 34, his family reversed the do-not-resuscitate order, and he was reevaluated for an LVAD as destination therapy. At this point, echocardiography showed a left ventricular ejection fraction of 10%, normal right ventricular function, with a normal heartbeat and valves. On day 47, a HeartMate II LVAD was placed.
On postoperative day 18, he was transferred out of the intensive care unit, then discharged to an acute rehabilitation facility 8 days later (hospital day 73). He was subsequently discharged.
At a recent follow-up appointment, the patient said that he was feeling “pretty good” and walked with no shortness of breath.
A 43-year-old man presented to a community hospital with acute chest pain and shortness of breath and was diagnosed with anterior ST-elevation myocardial infarction. He was a smoker with a history of alcohol abuse, hypertension, and hyperlipidemia, and in the past he had undergone percutaneous coronary interventions to the right coronary artery and the first obtuse marginal artery.
Angiography showed total occlusion in the left anterior descending artery, 90% stenosis in the right coronary artery, and mild disease in the left circumflex artery. A drug-eluting stent was placed in the left anterior descending artery, resulting in good blood flow.
However, his left ventricle continued to have severe dysfunction. An intra-aortic balloon pump was inserted. Afterward, computed tomography showed subsegmental pulmonary embolism with congestion. His mean arterial pressure was 60 mm Hg (normal 70–110), central venous pressure 12 mm Hg (3–8), pulmonary artery pressure 38/26 mm Hg (15–30/4–12), pulmonary capillary wedge pressure 24 mm Hg (2–15), and cardiac index 1.4 L/min (2.5–4).
The patient was started on dobutamine and norepinephrine and transferred to Cleveland Clinic on day 2. Over the next day, he had runs of ventricular tachycardia, for which he was given amiodarone and lidocaine. His urine output was low, and his serum creatinine was elevated at 1.65 mg/dL (baseline 1.2, normal 0.5–1.5). Liver function tests were also elevated, with aspartate aminotransferase at 115 U/L(14–40) and alanine aminotransferase at 187 U/L (10–54).
Poor oxygenation was evident: his arterial partial pressure of oxygen was 64 mm Hg (normal 75–100). He was intubated and given 100% oxygen with positive end-expiratory pressure of 12 cm H2O.
Echocardiography showed a left ventricular ejection fraction of 15% (normal 55%–70%) and mild right ventricular dysfunction.
ECMO and then Impella placement
On his third hospital day, a venoarterial extracorporeal membrane oxygenation (ECMO) device was placed peripherally (Figure 1).
His hemodynamic variables stabilized, and he was weaned off dobutamine and norepinephrine. Results of liver function tests normalized, his urinary output increased, and his serum creatinine dropped to a normal 1.0 mg/dL. However, a chest radiograph showed pulmonary congestion, and echocardiography now showed severe left ventricular dysfunction.
On hospital day 5, the patient underwent surgical placement of an Impella 5.0 device (Abiomed, Danvers, MA) through the right axillary artery in an effort to improve his pulmonary edema. The ECMO device was removed. Placement of a venovenous ECMO device was deemed unnecessary when oxygenation improved with the Impella.
Three days after Impella placement, radiography showed improved edema with some remaining pleural effusion.
ACUTE CARDIOGENIC SHOCK
Cardiogenic shock remains a challenging clinical problem: patients with it are among the sickest in the hospital, and many of them die. ECMO was once the only therapy available and is still widely used. However, it is a 2-edged sword; complications such as bleeding, infection, and thrombosis are almost inevitable if it is used for long. Importantly, patients are usually kept intubated and bedridden.
In recent years, new devices have become available that are easier to place (some in the catheterization laboratory or even at the bedside) and allow safer bridging to recovery, transplant, or other therapies.
This case illustrates the natural history of cardiogenic shock and the preferred clinical approach: ie, ongoing evaluation that permits rapid response to evolving challenges.
In general, acute cardiogenic shock occurs within 24 to 48 hours after the initial insult, so even if a procedure succeeds, the patient may develop progressive hypotension and organ dysfunction. Reduced cardiac output causes a downward spiral with multiple systemic and inflammatory processes as well as increased nitric oxide synthesis, leading to progressive decline and eventual end-organ dysfunction.
Continuously evaluate
The cardiac team should continuously assess the acuity and severity of a patient’s condition, with the goals of maintaining end-organ perfusion and identifying the source of problems. Refractory cardiogenic shock, with tissue hypoperfusion despite vasoactive medications and treatment of the underlying cause, is associated with in-hospital mortality rates ranging from 30% to 50%.1,2 The rates have actually increased over the past decade, as sicker patients are being treated.
When a patient presents with cardiogenic shock, we first try a series of vasoactive drugs and usually an intra-aortic balloon pump (Figure 2). We then tailor treatment depending on etiology. For example, a patient may have viral myocarditis and may even require a biopsy.
If cardiogenic shock is refractory, mechanical circulatory support devices can be a short-term bridge to either recovery or a new decision. A multidisciplinary team should be consulted to consider transplant, a long-term device, or palliative care. Sometimes a case requires “bridging to a bridge,” with several devices used short-term in turn.
Prognostic factors in cardiogenic shock
Several tools help predict outcome in a severely ill patient. End-organ function, indicated by blood lactate levels and estimated glomerular filtration rate, is perhaps the most informative and should be monitored serially.
CardShock3 is a simple scoring system based on age, mental status at presentation, laboratory values, and medical history. Patients receive 1 point for each of the following factors:
- Age > 75
- Confusion at presentation
- Previous myocardial infarction or coronary artery bypass grafting
- Acute coronary syndrome etiology
- Left ventricular ejection fraction < 40%
- Blood lactate level between 2 and 4 mmol/L, inclusively (2 points for lactate levels > 4 mmol/L)
- Estimated glomerular filtration rate between 30 and 60 mL/min/1.73 m2, inclusively (2 points if < 30 mL/min/1.73 m2).
Thus, scores range from 0 (best) to 9 (worst). A score of 0 to 3 points was associated with a 9% risk of death in the hospital, a score of 4 or 5 with a risk of 36%, and a score of 6 through 9 with a risk of 77%.3
The Survival After Veno-arterial ECMO (SAVE) score (www.save-score.com) is a prediction tool derived from a large international ECMO registry.4 It is based on patient age, diagnosis, and indicators of end-organ dysfunction. Scores range from –35 (worst) to +7 (best).
The mortality rate associated with postcardiotomy cardiogenic shock increases with the amount of inotropic support provided. In a 1996–1999 case series of patients who underwent open-heart surgery,5 the hospital mortality rate was 40% in those who received 2 inotropes in high doses and 80% in those who received 3. A strategy of early implementation of mechanical support is critical.
Selection criteria for destination therapy
Deciding whether a patient should receive a long-term device is frequently a challenge. The decision often must be based on limited information about not only the medical indications but also psychosocial factors that influence long-term success.
The Centers for Medicare and Medicaid Services have established criteria for candidates for left ventricular assist devices (LVADs) as destination therapy.6 Contraindications established for heart transplant should also be considered (Table 1).
CASE REVISITED
Several factors argued against LVAD placement in our patient. He had no health insurance and had been off medications. He smoked and said he consumed 3 hard liquor drinks per week. His Stanford Integrated Psychosocial Assessment for Transplantation score was 30 (minimally acceptable). He had hypoxia with subsegmental pulmonary edema, a strong contraindication to immediate transplant.
On the other hand, he had only mild right ventricular dysfunction. His CardShock score was 4 (intermediate risk, based on lactate 1.5 mmol/L and estimated glomerular filtration rate 52 mL/min/1.73 m2). His SAVE score was –9 (class IV), associated with a predicted survival rate of about 30%, enough to justify proceeding with treatment.
During the patient’s time on temporary support, the team had the opportunity to better understand him and assess his family support and his ability to handle a permanent device. His surviving the acute course bolstered the team’s confidence that he could enjoy long-term survival with destination therapy.
CATHETERIZATION LABORATORY DEVICE CAPABILITIES
Although most implantation procedures are done in the operating room, some must be done in the catheterization laboratory: patients undergoing catheterization may be too unstable for transfer, or an emergency intervention may be required during the night. Catheterization interventionists are also an important part of the team that determines the best approach to long-term therapy.
The catheterization laboratory has multiple acute intervention options. Usually, decisions must be made quickly. In general, patients needing mechanical support are managed as follows:
- Those who need circulation support and oxygenation receive ECMO
- Those who need circulation support alone because of mechanical issues (eg, myocardial infarction) are considered for an intra-aortic balloon pump, Impella, or TandemHeart pump (Cardiac Assist, Pittsburgh, PA).
Factors that guide the selection of a temporary pump include:
- Left ventricular function
- Right ventricular function
- Aortic valve stenosis (some devices cannot be inserted through critical aortic stenosis)
- Aortic regurgitation (can affect some devices)
- Peripheral artery disease (some devices require large-bore cannulas that may not pass safely through diseased peripheral arteries).
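The triage rules and selection factors above can be sketched as a simple rule chain. This is an illustrative summary of the text, not a clinical protocol; the function name and the fallback suggestions are assumptions.

```python
def suggest_temporary_support(needs_oxygenation: bool,
                              biventricular_failure: bool,
                              critical_aortic_stenosis: bool,
                              severe_peripheral_artery_disease: bool) -> str:
    """Illustrative rule-of-thumb triage for temporary mechanical support,
    following the selection factors discussed in the text."""
    if needs_oxygenation or biventricular_failure:
        # ECMO provides oxygenation and is the only percutaneous
        # option for biventricular support
        return "ECMO"
    if critical_aortic_stenosis:
        # Trans-aortic devices (Impella) cannot cross critical stenosis
        return "TandemHeart or intra-aortic balloon pump"
    if severe_peripheral_artery_disease:
        # Large-bore femoral cannulas may be unsuitable
        return "intra-aortic balloon pump or axillary Impella"
    return "intra-aortic balloon pump, Impella, or TandemHeart"
```

In practice, as the text notes, such decisions must be made quickly and with incomplete information; the factors above narrow the menu rather than dictate a single choice.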
CHOOSING AMONG PERCUTANEOUS DEVICES
Circulatory support in cardiogenic shock improves outcomes, and devices play an important role in supporting high-risk procedures. The goal is not necessarily to use the device throughout the hospital stay. Acute stabilization is most important initially; a more considered decision about long-term therapy can be made when more is known about the patient.
Patient selection is the most important component of success. However, randomized data to support outcomes with the various devices are sparse and complicated by the critically ill state of the patient population.
SHORT-TERM CIRCULATORY SUPPORT: ECMO, IMPELLA, TANDEMHEART
A menu of options is available for temporary mechanical support. Options differ by their degree of circulatory support and ease of insertion (Table 2).
ECMO: A fast option with many advantages
ECMO has evolved and now can be placed quickly. A remote diagnostic platform such as CardioHub permits management at the bedside, in the medical unit, or in the cardiac intensive care unit.7
ECMO has several advantages. It can be used during cardiopulmonary bypass, it provides oxygenation, it is the only option in the setting of lung injury, it can be placed peripherally (without thoracotomy), and it is the only percutaneous option for biventricular support.
ECMO also has significant disadvantages
ECMO is a good device for acute resuscitation of a patient in shock, as it offers quick placement and resuscitation. But it is falling out of favor because of significant disadvantages.
Its major drawback is that it does not unload the left ventricle. Although ECMO can stabilize end organs and restore their function in a very unstable patient, the myocardium itself is threatened: because the left ventricle is not unloaded, ventricular work is not reduced. ECMO creates extremely high afterload, so in a left ventricle with poor function, wall tension and myocardial oxygen demand increase. Multiple studies have shown that coronary perfusion worsens, especially with peripheral cannulation. Because relative cerebral hypoxia can occur in many situations, it is imperative to check blood oxygen saturation at multiple sites to confirm that perfusion is adequate everywhere.
Ineffective left ventricular unloading with venoarterial ECMO is managed in several ways. Sometimes left ventricular distention is slight and the effects are subtle. Left ventricular distention causing pulmonary edema can be addressed with:
- Inotropes (in moderate doses)
- Anticoagulation to prevent left ventricular thrombus formation
- An intra-aortic balloon pump. Most patients on ECMO already have an intra-aortic balloon pump in place, and it should be left in to provide additional support. For those who do not have one, it should be placed via the contralateral femoral artery.
If problems persist despite these measures, apical cannulation or left ventricular septostomy can be performed.
Outcomes with ECMO have been disappointing. Studies show that whether the indication is cardiac failure or respiratory failure, survival is only about 25% at 5 years. When data are analyzed for venoarterial ECMO alone, survival was 48% in patients bridged to another therapy and 41% in those who were weaned.
The Extracorporeal Life Support Organization Registry, in its 2010 international summary, found that 34% of cardiac patients on ECMO survived to discharge or transfer. Most of these patients had cardiogenic shock from acute myocardial infarction. Outcomes are poor because of complications endemic to ECMO, eg, dialysis-dependent renal failure (about 40% of patients) and neurologic complications (about 30%), often ischemic or hemorrhagic stroke.
Limb and pump complications were also significant in the past. These have been reduced with the new reperfusion cannula and the Quadrox oxygenator.
Complications unique to ECMO should be understood and anticipated so that they can be avoided. Better tools are available, ie, Impella and TandemHeart.
Left-sided Impella: A longer-term temporary support
ECMO is a temporary fix that is usually used only for a few days. If longer support is needed, axillary placement of an Impella should be used as a bridge to recovery, transplant, or a durable LVAD.
The Impella device (Figure 3) is a miniature rotary blood pump increasingly used to treat cardiogenic shock. It is inserted retrograde across the aortic valve to provide short-term ventricular support. Most devices are approved by the US Food and Drug Administration (FDA) for less than 7 days of use, but we have experience using them for up to 30 days. They are very hemocompatible, causing minimal hemolysis. Axillary placement allows early extubation and ambulation and is more stable than groin placement.
Several models are available: the 2.5 and 3.5 L/min devices can be placed percutaneously, while the 5 L/min model must be surgically placed in the axillary or groin region. Heparin is required with their use. They can replace ECMO. A right ventricular assist device (RVAD), Impella RP, is also available.
Physiologic impact of the Impella
The Impella fully unloads the left ventricle, reducing myocardial oxygen demand and increasing myocardial blood flow. It reduces end-diastolic volume and pressure, the mechanical work of the heart, and wall tension. Microvascular resistance is reduced, allowing increased coronary flow. Cardiac output and power are increased by multiple means.8–11
The RECOVER 1 trial evaluated the 5L Impella placed after cardiac surgery. The cardiac index increased in all the patients, and the systemic vascular resistance and wedge pressure decreased.12
Unloading the ventricle is critical. Meyns and colleagues13 found a fivefold reduction in infarct size from baseline in a left anterior descending occlusion model in pigs after off-loading the ventricle.
Impella has the advantage of simple percutaneous insertion (the 2.5 and CP models). It also tests right ventricular tolerance: if the right ventricle is doing well, one can predict with high certainty that it will tolerate a durable LVAD (eg, HeartWare, HeartMate II, or, when available, HeartMate 3).
Disadvantages include that it provides only left ventricular support, although a right ventricular device can be inserted for dual support. Placement requires fluoroscopic or echocardiographic guidance.
TandemHeart requires septal puncture
The TandemHeart is approved for short-term and biventricular use. It consists of an extracorporeal centrifugal pump that withdraws blood from the left atrium via a trans-septal cannula placed through the femoral vein (Figure 4) and returns it to one or both femoral arteries. The blood is pumped at up to 5 L/min.
It is designed to reduce the pulmonary capillary wedge pressure, ventricular work, and myocardial oxygen demand and increase cardiac output and mean arterial pressure. It has the advantages of percutaneous placement and the ability to provide biventricular support with 2 devices. It can be used for up to 3 weeks. It can easily be converted to ECMO by either splicing in an oxygenator or adding another cannula.
Although the TandemHeart provides significant support, it is now seldom used. A 21F venous cannula must be passed into the left atrium by trans-septal puncture, which requires advanced skill and must be done in the catheterization laboratory. Insertion can take considerable time and can cause bleeding in patients taking an anticoagulant. The puncture damages the atrial septum, and removal may require patch repair of the resulting septal defect. Systemic anticoagulation is required. Other disadvantages are the risks of hemolysis, limb ischemia, and infection with longer support times.
The CentriMag (Levitronix LLC, Framingham, MA), discussed below, is an improved device that requires only 1 cannula instead of 2.
DEVICES FOR RIGHT-SIDED SUPPORT
Most early devices were designed for left-sided support. The right heart, especially in failure, has been more difficult to manage. Previously the only option for a patient with right ventricular failure was venoarterial ECMO. This is more support than needed for a patient with isolated right ventricular failure and involves the risk of multiple complications from the device.
With more options available for the right heart (Table 3), we can choose the most appropriate device according to the underlying cause of right heart failure (eg, right ventricular infarct, pulmonary hypertension), the likelihood of recovery, and the expected time to recovery.
The ideal RVAD would be easy to implant, maintain, and remove. It would allow for chest closure and patient ambulation. It would be durable and biocompatible, so that it could remain implanted for months if necessary. It would cause little blood trauma, have the capability for adding an oxygenator for pulmonary support, and be cost-effective.
Although no single system has all these qualities, each available device fulfills certain combinations of these criteria, so the best one can be selected for each patient’s needs.
ECMO Rotaflow centrifugal pump: Fast, simple, inexpensive
A recent improvement to ECMO is the Rotaflow centrifugal pump (Maquet, Wayne, NJ), which is connected by sewing an 8-mm graft onto the pulmonary artery and placing a venous cannula in the femoral vein. If the patient is not bleeding, the chest can then be closed. This creates a fast, simple, and inexpensive temporary RVAD system. When the patient is ready to be weaned, the outflow graft can be disconnected at the bedside without reopening the chest.
The disadvantage is that the Rotaflow system contains a sapphire bearing. Although it is magnetically coupled, it generates heat and is a nidus for thrombus formation, which can lead to pump failure and embolization. This system can be used for patients who are expected to need support for less than 5 to 7 days. Beyond this duration, the incidence of complications increases.
CentriMag Ventricular Assist System offers right, left, or bilateral support
The CentriMag Ventricular Assist System is a fully magnetically levitated pump containing no bearings or seals, and with the same technology as is found in many of the durable devices such as HeartMate 3. It is coupled with a reusable motor and is easy to use.
CentriMag offers versatility, allowing for right, left, or bilateral ventricular support. An oxygenator can be added for pulmonary edema and additional support. It is the most biocompatible device and is FDA-approved for use for 4 weeks, although it has been used successfully for much longer. It allows for chest closure and ambulation. It is especially important as a bridge to transplant. The main disadvantage is that insertion and removal require sternotomy.
Impella RP: One size does not fit all
The Impella RP (Figure 5) has an 11F catheter diameter, a 23F pump, and a maximum flow rate of more than 4 L/min. Its unique 3-dimensional cannula design is based on computed tomographic 3-dimensional reconstructions from hundreds of patients.
The device is biocompatible and can be used for support for more than 7 days, although most patients require only 3 or 4 days. There is almost no priming volume, so there is no hemodilution.
The disadvantages are that it is more challenging to place than other devices, and some patients cannot use it because the cannula does not fit. It also does not provide pulmonary support. Finally, it is the most expensive of the 3 right-sided devices.
CASE REVISITED
The patient described at the beginning of this article was extubated on day 12 but was then reintubated. On day 20, a tracheotomy tube was placed. By day 24, he had improved so little that his family signed a “do-not-resuscitate–comfort-care-arrest” order (ie, if the patient’s heart or breathing stops, only comfort care is to be provided).
But slowly he got better, and the Impella was removed on day 30. Afterward, serum creatinine and liver function tests began rising again, requiring dobutamine for heart support.
On day 34, his family reversed the do-not-resuscitate order, and he was reevaluated for an LVAD as destination therapy. At this point, echocardiography showed a left ventricular ejection fraction of 10% with normal right ventricular function, rhythm, and valves. On day 47, a HeartMate II LVAD was placed.
On postoperative day 18, he was transferred out of the intensive care unit, then discharged to an acute rehabilitation facility 8 days later (hospital day 73). He was subsequently discharged.
At a recent follow-up appointment, the patient said that he was feeling “pretty good” and walked with no shortness of breath.
- Reyentovich A, Barghash MH, Hochman JS. Management of refractory cardiogenic shock. Nat Rev Cardiol 2016; 13:481–492.
- Wayangankar SA, Bangalore S, McCoy LA, et al. Temporal trends and outcomes of patients undergoing percutaneous coronary interventions for cardiogenic shock in the setting of acute myocardial infarction: a report from the CathPCI registry. JACC Cardiovasc Interv 2016; 9:341–351.
- Harjola VP, Lassus J, Sionis A, et al; CardShock Study Investigators; GREAT network. Clinical picture and risk prediction of short-term mortality in cardiogenic shock. Eur J Heart Fail 2015; 17:501–509.
- Schmidt M, Burrell A, Roberts L, et al. Predicting survival after ECMO for refractory cardiogenic shock: the survival after veno-arterial-ECMO (SAVE)-score. Eur Heart J 2015; 36:2246–2256.
- Samuels LE, Kaufman MS, Thomas MP, Holmes EC, Brockman SK, Wechsler AS. Pharmacological criteria for ventricular assist device insertion following postcardiotomy shock: experience with the Abiomed BVS system. J Card Surg 1999; 14:288–293.
- Centers for Medicare & Medicaid Services. Decision memo for ventricular assist devices as destination therapy (CAG-00119R2). www.cms.gov/medicare-coverage-database/details/nca-decision-memo.aspx?NCAId=243&ver=9&NcaName=Ventricular+Assist+Devices+as+Destination+Therapy+(2nd+Recon)&bc=BEAAAAAAEAAA&&fromdb=true. Accessed March 10, 2017.
- Kulkarni T, Sharma NS, Diaz-Guzman E. Extracorporeal membrane oxygenation in adults: a practical guide for internists. Cleve Clin J Med 2016; 83:373–384.
- Remmelink M, Sjauw KD, Henriques JP, et al. Effects of left ventricular unloading by Impella Recover LP2.5 on coronary hemodynamics. Catheter Cardiovasc Interv 2007; 70:532–537.
- Aqel RA, Hage FG, Iskandrian AE. Improvement of myocardial perfusion with a percutaneously inserted left ventricular assist device. J Nucl Cardiol 2010; 17:158–160.
- Sarnoff SJ, Braunwald E, Welch Jr GH, Case RB, Stainsby WN, Macruz R. Hemodynamic determinants of oxygen consumption of the heart with special reference to the tension-time index. Am J Physiol 1957; 192:148–156.
- Braunwald E. 50th anniversary historical article. Myocardial oxygen consumption: the quest for its determinants and some clinical fallout. J Am Coll Cardiol 1999; 34:1365–1368.
- Griffith BP, Anderson MB, Samuels LE, Pae WE Jr, Naka Y, Frazier OH. The RECOVER I: A multicenter prospective study of Impella 5.0/LD for postcardiotomy circulatory support. J Thorac Cardiovasc Surg 2013; 145:548–554.
- Meyns B, Stolinski J, Leunens V, Verbeken E, Flameng W. Left ventricular support by catheter-mounted axial flow pump reduces infarct size. J Am Coll Cardiol 2003; 41:1087–1095.
KEY POINTS
- ECMO is the fastest way to stabilize a patient in acute cardiogenic shock and prevent end-organ failure, but it should likely be used for a short time and does not reduce the work of (“unload”) the left ventricle.
- An intra-aortic balloon pump may help unload the left ventricle in a patient on ECMO.
- The TandemHeart provides significant support, but its insertion requires puncture of the atrial septum.
- The Impella fully unloads the left ventricle, critically reducing the work of the heart.
- Options for right-ventricular support include the ECMO Rotaflow circuit, CentriMag, and Impella RP.
- The CentriMag is the most versatile device, allowing right, left, or biventricular support, but placement requires sternotomy.