Antimicrobial Stewardship Programs: Effects on Clinical and Economic Outcomes and Future Directions
From the Ernest Mario School of Pharmacy, Rutgers, The State University of New Jersey, Piscataway, NJ.
Abstract
- Objective: To review the evidence evaluating inpatient antimicrobial stewardship programs (ASPs) with a focus on clinical and economic outcomes.
- Methods: PubMed/MEDLINE and the Cochrane Database of Systematic Reviews were searched to identify systematic reviews, meta-analyses, randomized controlled trials, and other relevant literature evaluating the clinical and economic impact of ASP interventions.
- Results: A total of 5 meta-analyses, 3 systematic reviews, and 10 clinical studies (2 randomized controlled trials, 5 observational studies, and 3 quasi-experimental studies) were identified for analysis. ASPs were associated with a reduction in antimicrobial consumption and use. However, due to the heterogeneity of outcomes measured among studies, the effectiveness of ASPs varied with the measures used. There are data supporting the cost savings associated with ASPs, but such studies are sparser. Most of the available evidence supporting ASPs is of low quality, and intervention strategies vary widely among available studies.
- Conclusion: Much of the evidence reviewed supports the assertion that ASPs result in more judicious use of antimicrobials and lead to better patient care in the inpatient setting. While clinical outcomes vary between programs, ASPs are consistently associated with benefits in antimicrobial consumption, C. difficile infection rates, and resistance, with few adverse effects. To date, economic outcomes have been difficult to quantify uniformly, but there are data supporting the economic benefits of ASPs. As the number of ASPs continues to grow, it is imperative that standardized metrics be adopted in order to accurately measure the benefits of these essential programs.
Key words: Antimicrobial stewardship; antimicrobial consumption; resistance.
Antimicrobial resistance is a public health concern that has escalated over the years and is now recognized as a global crisis [1–3]. This is partly due to the widespread use of the same antibiotics that have existed for decades, combined with insufficient discovery and development of novel antibiotics [4]. Bacteria resistant to our last-line-of-defense medications have recently emerged, and these resistant organisms may spread to treatment-naive patients [5]. Multidrug-resistant organisms are most often encountered and treated, and likely originate, within the hospital setting, where antimicrobials can be prescribed by any licensed provider [6]. Upwards of 50% of antibiotics administered are unnecessary and contribute to the problem of increasing resistance [7]. The seriousness of this situation is increasingly apparent; in 2014 the World Health Organization (WHO), President Obama, and Prime Minister Cameron issued statements urging solutions to the resistance crisis [8].
While the urgency of the situation is recognized today, efforts aimed at a more judicious use of antibiotics to curb resistance began as early as the 1960s and led to the first antimicrobial stewardship programs (ASPs) [9–11]. ASPs have since been defined as “coordinated interventions designed to improve and measure the appropriate use of antimicrobial agents by promoting the selection of the optimal antimicrobial drug regimen including dosing, duration of therapy, and route of administration” [1]. The primary objectives of these types of programs are to avoid or reduce adverse events (eg, Clostridium difficile infection) and resistance driven by a shift in minimum inhibitory concentrations (MICs) and to reverse the unnecessary economic burden caused by the inappropriate prescribing of these agents [1].
This article reviews the evidence evaluating the reported effectiveness of inpatient ASPs, examining both clinical and economic outcomes. In addition, we touch on ASP history, current status, and future directions in light of current trends. While ASPs are expanding into the outpatient and nursing home settings, we limit our review here to the inpatient setting.
Historical Background
Modern antibiotics date back to the late 1930s, when penicillin and sulfonamides were introduced to the medical market; resistance to these drug classes was reported just a few years after their introduction. The same bacterial resistance mechanisms that neutralized their efficacy then persist today and continue to confer resistance to those classes [5].
While “stewardship” was not described as such until the late 1990s [12], institutions have historically been proactive in creating standards around antimicrobial utilization to encourage judicious use of these agents. The earliest forms of tracking antibiotic use were paper-based “antibiotic logs” [9] and “punch cards” [10] in the 1960s. The idea of a team approach to stewardship dates back to the 1970s, exemplified by Hartford Hospital in Hartford, Connecticut, which employed an antimicrobial standards model run by an infectious disease (ID) physician and clinical pharmacists [11]. In 1977, the Infectious Diseases Society of America (IDSA) released a statement that clinical pharmacists may have a substantial impact on patient care, including in ID, contributing to the idea that physicians collaborating with pharmacists is the best way to combat inappropriate medication use. Pharmacist involvement has since been shown to curb the use of overutilized broad-spectrum antimicrobial agents and to significantly reduce the rate of C. difficile infection [13].
In 1997 the IDSA and the Society for Healthcare Epidemiology of America (SHEA) published guidelines to assist in the prevention of the growing issue of resistance, noting the importance of antimicrobial stewardship [14]. A decade later they released joint guidelines for ASP implementation [15], and the Pediatric Infectious Diseases Society (PIDS) joined them in 2012 to publish a joint statement acknowledging and endorsing stewardship [16]. In 2014, the Centers for Disease Control and Prevention (CDC) recommended that every hospital have an ASP. As of 1 January 2017, the Joint Commission requires an ASP as a standard for accreditation at hospitals, critical access hospitals, and nursing care centers [17]. Guidelines for implementation of an ASP are currently available through the IDSA and SHEA [1,16].
ASP Interventions
ASPs rely on 2 main strategies to combat inappropriate antimicrobial use, each with its own set of systematic interventions: “prospective audit with intervention and feedback” and “prior authorization” [6]. Although most ASPs incorporate both, each institution typically develops its own policies and regulations independently.
Prospective audit with intervention and feedback describes the process of providing recommendations after reviewing utilization and trends of antimicrobial use. This is sometimes referred to as the “back-end” intervention, in which decisions are made after antibiotics have been administered. Interventions that are commonly used under this strategy include discontinuation of antibiotics due to culture data, de-escalation to drugs with narrower spectra, IV to oral conversions, and cessation of surgical prophylaxis [6].
Prior authorization, also referred to as a “front-end” intervention, is the process of approving medications before they are used. Interventions include a restricted formulary for antimicrobials that can be managed through a paging system or a built-in computer restriction program, as well as other guidelines and protocols for dosing and duration of therapy. Restrictions typically focus on broad spectrum antibiotics as well as the more costly drugs on formularies. These solutions reduce the need for manual intervention as technology makes it possible to create automated restriction-based services that prevent inappropriate prescribing [6].
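As a purely hypothetical illustration of this front-end approach (the drug names, approver roles, and function below are assumptions for the sketch, not drawn from any cited program), an automated prior-authorization check can be reduced to a simple lookup that either releases an order or routes it for approval:

```python
# Hypothetical sketch of a front-end (prior authorization) formulary check.
# The restricted-agent list, approver roles, and order fields are illustrative
# assumptions, not taken from any specific ASP described in the text.

RESTRICTED_AGENTS = {
    # broad-spectrum or high-cost agents an ASP might gate behind approval
    "meropenem": "ID physician or ASP pharmacist",
    "daptomycin": "ID physician or ASP pharmacist",
    "linezolid": "ASP pharmacist",
}

def review_order(drug: str, indication: str, has_approval: bool) -> str:
    """Return a disposition for a new antimicrobial order."""
    drug = drug.lower()
    if drug not in RESTRICTED_AGENTS:
        return "release: non-restricted agent"
    if has_approval:
        return f"release: approval on file for {drug} ({indication})"
    return (f"hold: contact {RESTRICTED_AGENTS[drug]} for authorization "
            f"before dispensing {drug}")

if __name__ == "__main__":
    print(review_order("cefazolin", "surgical prophylaxis", has_approval=False))
    print(review_order("meropenem", "ESBL bacteremia", has_approval=False))
```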
Aside from these main techniques, other approaches are used to achieve the goal of attaining optimal clinical outcomes while limiting further antimicrobial resistance and adverse effects. Different clinical settings have different needs, and ASPs are customized to each setting’s resources, prescribing habits, and other local considerations [1]. These differences make diverse datasets difficult to interpret, but certain themes arise in the literature: commonly assessed clinical outcomes of inpatient ASPs include hospital length of stay (LOS) and readmission, reinfection, mortality, and resistance rates. These outcomes are putatively driven by more prudent use of antimicrobials, particularly by decreased rates of antimicrobial consumption.
ASP Team Members
While ASPs may differ between institutions, the staff members involved are typically the same, and leadership is always an important aspect of a program. The CDC recommends that ASP leadership consist of a program leader (an ID physician) and a pharmacy leader, who co-lead the team [18]. In addition, the Joint Commission recommends that the multidisciplinary team should include an infection preventionist (ie, infection control and hospital epidemiologist) and practitioner [17]; these specialists have a role in prevention, awareness, and policy [19]. The integration of infection control with stewardship yields the best results [15], as infection control aims to prevent antibiotic use altogether, while stewardship increases the quality of antibiotic regimens that are being prescribed [20].
It is also beneficial to incorporate a microbiologist as an integral part of the team, responsible for performing and interpreting laboratory tests (eg, cultures). Nurses should be integrated into ASPs due to the overlap of their routine activities with ASP interventions [21]; other clinicians (regardless of their infectious disease clinical background), quality control, information technology, and environmental services should all collaborate in the hospital-wide systems related to the program where appropriate [18].
Evidence Review
Results
Antimicrobial Usage
The most widely studied aspect of ASPs in the current review was the effect of ASP interventions on antimicrobial consumption and use. Three systematic reviews [22–24] showed improved antibiotic prescribing practices and reduced consumption rates overall, as did several studies inside and outside the intensive care unit (ICU) [25–31]. One study found a nonsignificant declining usage trend [32]. An important underlying facet of this observation is that even as total antibiotic consumption decreases, consumption of certain antibiotics and antibiotic classes may increase. This is evident in several studies, which showed that as aminoglycoside, carbapenem, and β-lactam-β-lactamase inhibitor use increased, clindamycin (1 case), glycopeptide, fluoroquinolone, and macrolide use decreased [27,28,30]. A potential confounding factor relating to decreased glycopeptide use in Bevilacqua et al [30] was an epidemic of glycopeptide-resistant enterococci during the study period, which may have led prescribers to avoid this class. In any case, since the aim of ASPs is to encourage more judicious usage of antimicrobials, the observed decreases in consumption of restricted medications are intuitive. These observations about antimicrobial consumption related to ASPs are relevant because they putatively drive improvements in clinical outcomes, especially those related to reduced adverse events associated with these agents, such as the risk of C. difficile infection with certain drugs (eg, fluoroquinolones, clindamycin, and broad-spectrum antibiotics) and prolonged antibiotic usage [33–35]. There is evidence that these benefits are not limited to antibiotics but extend to antifungal agents and possibly antivirals [22,27,36].
Utilization, Mortality, and Infection Rates
ASPs typically intend to improve patient-focused clinical parameters such as hospital LOS, hospital readmissions, mortality, and incidence of infections acquired secondary to antibiotic usage during a hospital stay, especially C. difficile infection. Most of the reviewed evidence indicates no significant LOS benefit due to stewardship interventions [24–26,32,37], and one meta-analysis noted that while overall hospital LOS was significantly reduced, ICU-specific LOS was not [22]. Generally, there was also no significant change in hospital readmission rates [24,26,32]. However, 2 retrospective observational studies found mixed results for both LOS and readmission rates relative to ASP interventions; while both noted a significantly reduced LOS, one study [38] showed an all-cause readmission benefit in a fairly healthy patient population (but no benefit for readmissions due to the specific infections of interest), and the other [29] showed a benefit for readmissions due to infections but an increased rate of readmissions in the intervention group overall. In this latter study, hospitalizations within the previous 3 months were significantly higher at baseline for the intervention group (55% vs. 46%, P = 0.042), suggesting sicker patients and possibly explaining this unique observation. Even so, a meta-analysis of 5 studies found a significantly elevated risk of readmission associated with ASP interventions (RR 1.26, 95% CI 1.02–1.57; P = 0.03); the authors noted that non–infection-related readmissions accounted for 61% of readmissions, but this proportion was not significantly different between intervention and non-intervention arms [37].
With regard to mortality, most studies found no significant reductions related to stewardship interventions [22,24,26,29,32]. In a prospective randomized controlled trial, all reported deaths (7/160, 4.4%) occurred in the ASP intervention arm, but these were attributed to the severity of infection or underlying chronic disease [25]. One meta-analysis, however, found significant mortality reductions related to stewardship guidelines for empirical antibiotic treatment (OR 0.65, 95% CI 0.54–0.80, P < 0.001; I² = 65%) and to de-escalation of therapy based on culture results (RR 0.44, 95% CI 0.30–0.66, P < 0.001; I² = 59%), based on 40 and 25 studies, respectively [39]; both results, however, exhibited substantial heterogeneity (defined as I² = 50%–90% [40]) among the relevant studies. Another meta-analysis found no significant change in mortality related to stewardship interventions intended to improve antibiotic appropriateness (RR 0.92, 95% CI 0.69–1.2, P = 0.56; I² = 72%) or to reduce excessive prescribing (RR 0.92, 95% CI 0.81–1.06, P = 0.25; I² = 0%), but a significant mortality benefit was associated with interventions aimed at increasing guideline compliance for pneumonia diagnoses (RR 0.89, 95% CI 0.82–0.97, P = 0.005; I² = 0%) [37]. In the case of Schuts et al [39], the search criteria specifically sought studies that assessed clinical outcomes (eg, mortality), whereas the search of Davey et al [37] focused on studies whose aim was to improve antibiotic prescribing, with a main comparison between restrictive and persuasive interventions; while the difference may seem subtle, the bodies of data compiled from these searches may characterize the ASP effect on mortality differently. No significant evidence was found to suggest that reduced antimicrobial consumption increases mortality.
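For reference, the I² statistic used throughout to grade heterogeneity is computed from Cochran's Q and the number of pooled studies k, per the Cochrane Handbook definition cited above [40]; a minimal rendering of that definition is:

```latex
% I^2 heterogeneity statistic (Cochrane Handbook definition [40]):
% Q is Cochran's heterogeneity statistic and k the number of pooled studies,
% so k - 1 is the degrees of freedom; negative values are truncated to zero.
I^{2} = \max\!\left(0,\; \frac{Q - (k - 1)}{Q}\right) \times 100\%
```

Values in the 50%–90% range, as seen for the Schuts et al pooled estimates, are conventionally read as substantial heterogeneity, which is why those mortality results are interpreted cautiously here.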
Improving the use of antimicrobial agents should limit the collateral damage associated with their use (eg, damage to normal flora and increased resistance), and ideally infections should be better managed. As previously mentioned, one of the concerns with antibiotic usage (particularly fluoroquinolones, macrolides, and broad-spectrum agents) is that collateral damage could lead to increased rates of C. difficile infection. One meta-analysis showed no significant reduction in the rate of C. difficile infection (or in the overall infection rate) relative to ASPs [22]; however, this finding was based on only 3 of the 26 studies analyzed, and only 1 of those 3 studies restricted fluoroquinolones and cephalosporins. An interrupted time series (ITS) study similarly found no significant reduction in the C. difficile infection rate [32]; however, this study was conducted in a hospital with low baseline antibiotic prescribing (it ranked second-to-last in antibiotic usage among its peer institutions), inherently limiting the risk of C. difficile infection among patients in the pre-ASP setting. In contrast to these findings, a meta-analysis specifically designed to assess the incidence of C. difficile infection relative to stewardship programs found a significantly reduced risk of infection based on 16 studies (RR 0.48, 95% CI 0.38–0.62, P < 0.001; I² = 76%) [41], and the systematic review conducted by Filice et al [24] found a significant benefit with regard to the C. difficile infection rate in 4 of 6 studies. These results are consistent with those presented by the CDC as evidence for the impact of stewardship on C. difficile infection [42]. Aside from C. difficile infection, one retrospective observational study found that the 14-day reinfection rate (ie, reinfection with the same infection at the same anatomical location) was significantly reduced following stewardship intervention (0% vs. 10%, P = 0.009) [29]. This finding, combined with the C. difficile infection examples, provides evidence of better infection management by ASPs.
While the general trend seems to suggest mixed or no significant benefit for several clinical outcomes, it is important to note that variation in outcomes could be due to differences in the types of ASP interventions and in intervention study periods across programs. Davey et al [37] found variation in prescribing outcomes based on whether restrictive (ie, restricting prescriber freedom with antimicrobials) or persuasive (ie, suggesting changes to the prescriber) interventions were used, and on the timeframe in which they were used. At one month into an ASP, restrictive interventions resulted in better prescribing practices relative to persuasive interventions based on 27 studies (effect size 32.0%, 95% CI 2.5%–61.4%), but by 6 months the 2 were not statistically different (effect size 10.1%, 95% CI –47.5% to 66.0%). At 12 and 24 months, persuasive interventions demonstrated greater effects on prescribing outcomes, but these differences were not significant. These findings indicate that different study timeframes can affect reported ASP performance, and timeframes already vary widely in the literature. Given the variety of ASP interventions employed across studies, these factors almost certainly affect the reported antimicrobial consumption rates and outcomes to different degrees. A high degree of heterogeneity within an analyzed dataset could itself be the reason for net non-significance within individual systematic reviews and meta-analyses.
Resistance
Another goal of ASPs is the prevention of antimicrobial resistance, an area where the evidence generally suggests benefit associated with ASP interventions. Rates of common troublesome resistant organisms, such as methicillin-resistant S. aureus (MRSA), imipenem-resistant P. aeruginosa, and extended-spectrum β-lactamase (ESBL)–producing Klebsiella spp, were significantly reduced in a meta-analysis; ESBL-producing E. coli infections were not, however [22]. An ITS study found significantly reduced MRSA resistance, as well as reduced resistance of Pseudomonas to imipenem-cilastatin and levofloxacin (all P < 0.001), but no significant changes with respect to piperacillin/tazobactam, cefepime, or amikacin resistance [32]. This study also noted increased E. coli resistance to levofloxacin and ceftriaxone (both P < 0.001). No significant changes were noted for vancomycin-resistant enterococci. It may be reasonable to expect that decreasing inappropriate antimicrobial use will decrease long-term antimicrobial resistance; but because most studies span only a few years, only small, short-term changes in resistance are captured [23]. Longer duration studies are needed to better understand resistance outcomes.
Of note is a phenomenon known as the “squeezing the balloon” effect. This can be associated with ASPs, potentially resulting in paradoxically increased resistance [43]. That is, when usage restrictions are placed on certain antibiotics, the use of other non-restricted antibiotics may increase, possibly leading to increased resistance to those non-restricted antibiotics [22] (“constraining one end [of a balloon] causes the other end to bulge … limiting the use of one class of compounds may be counteracted by corresponding changes in prescribing and drug resistance that are even more ominous” [43]). Karanika et al [22] took this phenomenon into consideration and assessed restricted and non-restricted antimicrobial consumption separately. They found a reduction in consumption for both restricted and non-restricted antibiotics, the latter including “high potential resistance” antibiotics, specifically carbapenems and glycopeptides. In the study conducted by Cairns et al [28], a similar effect was observed; while the use of other classes of antibiotics decreased (eg, cephalosporins and aminoglycosides), the use of β-lactam-β-lactamase inhibitor combinations actually increased by 48% (change in use: +48.2% [95% CI 21.8%–47.9%]). Hohn et al [26] noted an increased usage rate of carbapenems, even though several other classes of antibiotics had reduced usage. Unfortunately, neither study reported resistance rates, so the impact of these findings is unknown. Finally, Jenkins et al [32] assessed trends in antimicrobial use as changes in rates of consumption. Among the various antibiotics assessed in this study, the rate of fluoroquinolone use decreased both before and after the intervention period, although the decline slowed post-ASP (the change in rate post-ASP was +2.2% [95% CI 1.4%–3.1%], P < 0.001). They observed a small (but significant) increase in resistance of E. coli to levofloxacin pre- vs. post-intervention (11.0% vs. 13.9%, P < 0.001); in contrast, a significant decrease in resistance of P. aeruginosa was observed (30.5% vs. 21.4%, P < 0.001). While these examples help illustrate the concept of changes in antibiotic usage patterns associated with an ASP, at best they approximate the “squeezing the balloon” effect, since these studies present data for antibiotics that were either restricted or for which restriction was not clearly specified. The “squeezing the balloon” effect is most relevant for the unintended, potentially increased usage of non-restricted drugs secondary to ASP restrictions. Higher resistance rates among certain drug classes observed in the context of this effect would constitute a drawback to an ASP.
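To make the trend terminology above concrete, the following is a minimal sketch (using synthetic monthly data and an assumed 12-month pre-/post-ASP split, not a reanalysis of any cited study) of the segmented regression typically used in an interrupted time series to estimate a change in consumption rate:

```python
# Minimal segmented-regression sketch for an interrupted time series (ITS),
# the design used in the cited studies to report pre- vs. post-ASP changes in
# consumption trends. Monthly values here are synthetic and purely illustrative.
import numpy as np

months = np.arange(24)                               # 12 months pre-ASP, 12 post-ASP
post = (months >= 12).astype(float)                  # level-change indicator
time_after = np.where(months >= 12, months - 12, 0)  # months elapsed since the ASP began

rng = np.random.default_rng(0)
# Synthetic consumption series: downward baseline trend that flattens post-ASP.
use = 100 - 1.5 * months + 1.0 * time_after + rng.normal(0, 2, months.size)

# Design matrix: intercept, baseline trend, level change, and change in trend.
X = np.column_stack([np.ones_like(months, dtype=float), months, post, time_after])
coef, *_ = np.linalg.lstsq(X, use, rcond=None)

intercept, baseline_slope, level_change, slope_change = coef
print(f"baseline trend: {baseline_slope:+.2f} per month")
print(f"change in trend after ASP: {slope_change:+.2f} per month")
```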
Adverse Effects
Reduced toxicities and adverse effects are expected with reduced usage of antimicrobials. The systematic review conducted by Filice et al [24] examined the incidence of adverse effects related to antibiotic usage, and their findings suggest, at the least, that stewardship programs generally do not cause harm, as only 2 of the studies they examined reported adverse events. Following stewardship interventions, 5.5% of the patients deteriorated, and of those, the large majority (75%) deteriorated due to progression of oncological malignancies. To further illustrate the effect of stewardship interventions on the toxicities and side effects of antimicrobials, Schuts et al demonstrated that the risk of nephrotoxicity while on antimicrobial therapy was reduced as a result of an ASP, based on 14 studies with moderate heterogeneity (OR 0.46, 95% CI 0.28–0.77, P = 0.003; I² = 34%) [39,44]. It is intuitive that reduced drug exposure results in fewer adverse effects; as such, these results are expected.
Economic Outcomes
Although the focus of ASPs is often to improve clinical outcomes, economic outcomes are an important component of these programs, which bring associated economic value that should be highlighted and further detailed [22,45,46]. Because clinical outcomes are usually the main objective of ASPs, most available studies have been clinical effect studies (rather than economic analyses), in which economic assessments are a secondary consideration, if included at all.
As a result, cost evaluations generally address direct cost reductions, whereas indirect cost reductions are often not critically evaluated. Where ASPs are effective at decreasing antimicrobial consumption, they reduce hospital expenditures by limiting hospital-acquired infections and their associated medical costs [22,45], and by reducing antibiotic misuse, iatrogenic infections, and the rates of antibiotic-resistant organisms [47]. In one retrospective observational study, annual antibiotic costs dropped by 33% with re-implementation of an ASP, mirrored by an overall decrease in antibiotic consumption of about 10% over the course of the intervention study period [30]. Of note, at 1 year post-ASP re-implementation, antibiotic consumption actually increased (by 5.4%); however, because antibiotic usage had shifted to more appropriate and cost-effective therapies, antibiotic-related expenditures were still reduced by 13% for that year relative to the pre-ASP re-implementation baseline. Aside from economic evaluations centered on consumption rates, there is the potential to further evaluate economic benefits associated with stewardship when looking at other outcomes, including hospital LOS [22], as well as indirect costs such as morbidity and mortality, societal, and operational costs [46]. Currently, these detailed analyses are lacking. In conjunction with more standardized clinical metrics, these assessments are needed to better delineate the full cost effectiveness of ASPs.
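The decoupling of cost from consumption described above can be shown with a deliberately simple, hypothetical calculation (the agent categories, unit costs, and volumes below are invented for the sketch and are not drawn from Bevilacqua et al): shifting volume from an expensive broad-spectrum agent toward a cheaper, more appropriate one can lower total expenditure even while total consumption rises.

```python
# Hypothetical illustration: total antibiotic expenditure can fall even when
# total consumption rises, if prescribing shifts toward cheaper, more
# appropriate agents. All unit costs and volumes are invented for this sketch.

pre_asp  = {"broad_spectrum": (600, 50.0), "narrow_spectrum": (400, 5.0)}   # (units, cost per unit)
post_asp = {"broad_spectrum": (300, 50.0), "narrow_spectrum": (754, 5.0)}

def totals(mix):
    """Return total units consumed and total expenditure for a prescribing mix."""
    units = sum(u for u, _ in mix.values())
    cost = sum(u * c for u, c in mix.values())
    return units, cost

for label, mix in (("pre-ASP", pre_asp), ("post-ASP", post_asp)):
    units, cost = totals(mix)
    print(f"{label}: {units} units, ${cost:,.0f}")
# Output: consumption rises ~5% (1000 -> 1054 units) while spending falls
# ~41% ($32,000 -> $18,770), mirroring the direction of the effect in the text.
```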
Evidence Summary
The evidence for inpatient ASP effectiveness is promising but mixed. Much of the evidence is low-level, derived from retrospective observational studies, and the systematic reviews and meta-analyses are built on these types of studies. Studies have been conducted over a range of years, and the duration of intervention periods varies widely between studies; it is difficult to capture and account for all of the infection, prescribing, and drug availability patterns (as well as intervention differences or new drug approvals) throughout these time periods. To complicate matters, both the quality of the data and the quality of the ASPs themselves are highly variable.
As such, the findings across pooled studies of ASPs are hard to amalgamate and draw concrete conclusions from. This difficulty stems from the inherent heterogeneity of comparing smaller individual studies in systematic reviews and meta-analyses. Currently, there are numerous ways to implement an ASP, but there is no standardized system of specific interventions or metrics. Until similar ASPs and interventions can be compared directly among institutions, it will be challenging to generalize positive benefits from systematic reviews and meta-analyses. The CDC is currently involved in a new initiative in which data from various hospitals are compiled to create a surveillance database [48]. Although this is a step in the right direction toward standardized stewardship metrics, for the current review the lack of standard metrics leads to conflicting results among heterogeneous studies, making it difficult to show clear benefits in clinical outcomes.
Despite the vast array of ASPs, their differences, and a range of clinical measures, many with conflicting evidence, there is a noticeable trend toward more prudent use of antimicrobials. Based on the review of available evidence, inpatient ASPs improve patient care and preserve an important health care resource: antibiotics. As presented above, this is demonstrated by altered consumption of these agents, which has ramifications for secondary outcomes such as reduced rates of C. difficile infection, resistance, and adverse effects, and which overall translates into better patient care and reduced costs. But while we can conclude that the direct stewardship interventions of reducing and restricting antibiotic use have been effective, we cannot clearly state the overall magnitude of benefit, the effectiveness of various ASP structures and components on clinical outcomes (such as LOS and mortality), or the cost savings, due to the heterogeneity of the available evidence.
Future Directions
Moving forward, the future of ASPs encompasses several potential developments. First and foremost, as technological advancements continue, there is a need to integrate and utilize developments in information technology (IT). Baysari et al conducted a review of the value of IT interventions, focusing mainly on decision support (stand-alone or as a component of other hospital systems), approval, and surveillance systems [49]. These IT interventions were associated with improvement in the appropriate use of antimicrobials (RR 1.49, 95% CI 1.07–2.08, P < 0.05; I² = 93%), but no benefit was demonstrated in terms of patient mortality or hospital LOS. Beyond this study, broad evidence supporting the use of IT systems in ASPs is still lacking because meaningful comparisons among interventions have not been made, owing to widespread variability in study design and outcome measures. However, it is generally agreed that ASPs must integrate with IT systems as the use of technology within healthcare continues to grow. Higher quality studies centered on similar outcomes are needed to show appropriate approaches for ASPs to leverage IT systems. At a minimum, the integration of IT into ASPs should not hinder clinical outcomes. An important consideration is the variation in practice settings where antibiotic stewardship is to be implemented; eg, a small community hospital will be less equipped to incorporate and support technological tools than a large tertiary teaching hospital. Therefore, any antibiotic stewardship IT intervention must be customized to meet local needs, account for prescriber behaviors, minimize barriers to implementation, and utilize available resources.
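As a purely illustrative sketch of the stand-alone, rule-based decision support discussed above (the rules, thresholds, and record fields are assumptions for the example, not features of any system reviewed by Baysari et al), such a tool can be as simple as a set of persuasive advisory checks run against active antimicrobial orders:

```python
# Hypothetical sketch of a rule-based antimicrobial decision-support check.
# Rules, thresholds, and record fields are illustrative assumptions only.
from dataclasses import dataclass
from typing import List

@dataclass
class AntimicrobialOrder:
    drug: str
    days_of_therapy: int
    route: str                 # "IV" or "PO"
    tolerating_oral: bool
    culture_finalized: bool

def stewardship_alerts(order: AntimicrobialOrder) -> List[str]:
    """Return advisory (persuasive, not restrictive) alerts for ASP review."""
    alerts = []
    if order.route == "IV" and order.tolerating_oral and order.days_of_therapy >= 3:
        alerts.append("Consider IV-to-oral conversion.")
    if order.days_of_therapy >= 7 and not order.culture_finalized:
        alerts.append("Prolonged empiric therapy: review cultures and consider de-escalation.")
    return alerts

# Example: a patient on day 4 of IV therapy who is tolerating oral intake.
print(stewardship_alerts(AntimicrobialOrder("ceftriaxone", 4, "IV", True, False)))
```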
Another area of focus for future ASPs is the use of rapid diagnostics. Currently, when patients present with signs and symptoms of an infection, an empiric antimicrobial regimen is started and then de-escalated as necessary; rapid testing can help initiate appropriate therapy more quickly and increase antimicrobial effectiveness. Rapid tests range from rapid polymerase chain reaction (PCR)-based screening [50] to Verigene gram-positive blood culture (BC-GP) tests [51], next-generation sequencing methods, and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) [52]. Rapid diagnostic tools should be viewed as aids that help ASPs decrease antibiotic consumption and improve patient outcomes; these tools have been shown to improve clinical outcomes when integrated into ASPs, but offer little value toward the goals of ASPs when used outside of stewardship programs and their time-sensitive workflows [53].
In terms of future expansion, stewardship implementation can become more unified and broader in scope. ASPs should expand to include antifungal interventions, an area that is showing progress [36]. ASPs can also be implemented in new areas throughout the hospital (eg, pediatrics and the emergency department), as well as in settings outside the hospital, including long-term care facilities, dialysis centers, and other institutions [54–56]. A prospective cluster randomized controlled study was conducted in 30 nursing homes to evaluate the use of a novel resident antimicrobial management plan (RAMP) to improve the use of antimicrobials [57]. This study found that the RAMP had no associated adverse effects, suggesting that ASPs are an important tool in nursing homes. In addition, the general outpatient and pediatric settings show promise for ASPs [56,58,59], but more research is needed to support expansion and to identify how ASP interventions should be applied in these various practice settings. The antimicrobial stewardship interventions to be utilized will need to be carefully delineated to account for the scale, underlying need, and potential challenges in those settings.
While the future of antibiotic stewardship is unclear, it will certainly continue to develop in both scope and depth: encompassing new areas of focus, improving outcomes in new settings, and employing new tools to refine approaches. An important first step for the continued development of ASPs is alignment and standardization, since without alignment it will remain difficult to compare outcomes. This issue is currently being addressed by a number of organizations. With current support from the Joint Commission, the CDC, and the President’s Council of Advisors on Science and Technology (PCAST) [8], regulatory requirements for ASPs are well underway, and these drivers will position ASPs for further advancement. By reducing variability among ASPs and delineating their implementation, the economic and clinical benefits associated with specific interventions can be clearly identified.
Corresponding author: Luigi Brunetti, PharmD, MPH, Rutgers, The State University of New Jersey, 160 Frelinghuysen Rd., Piscataway, NJ 08854, [email protected].
Financial disclosures: None.
1. Barlam TF, Cosgrove SE, Abbo AM, et al. Implementing an antimicrobial stewardship program: guidelines by the Infectious Diseases Society of America and the Society of Healthcare Epidemiology of America. Clin Infect Dis 2016;62:e51–77.
2. Hughes D. Selection and evolution of resistance to antimicrobial drugs. IUBMB Life 2014;66:521–9.
3. World Health Organization. The evolving threat of antimicrobial resistance – options for action. Geneva: WHO Press; 2012.
4. Gould IM, Bal AM. New antibiotic agents in the pipeline and how they can help overcome microbial resistance. Virulence 2013;4:185–91.
5. Davies J, Davies D. Origins and evolution of antibiotic resistance. Microbiol Mol Biol Rev 2010;74:417–33.
6. Owens RC Jr. Antimicrobial stewardship: concepts and strategies in the 21st century. Diagn Microbiol Infect Dis 2008;61:110–28.
7. Antibiotic resistance threats in the United States, 2013 [Internet]. Centers for Disease Control and Prevention. Available at www.cdc.gov/drugresistance/pdf/ar-threats-2013-508.pdf.
8. Nathan C, Cars O. Antibiotic resistance – problems, progress, prospects. N Engl J Med 2014;371:1761–3.
9. McGoldrick M. Antimicrobial stewardship. Home Healthc Nurse 2014;32:559–60.
10. Ruedy J. A method of determining patterns of use of antibacterial drugs. Can Med Assoc J 1966;95:807–12.
11. Briceland LL, Nightingale CH, Quintiliani R, et al. Antibiotic streamlining from combination therapy to monotherapy utilizing an interdisciplinary approach. Arch Intern Med 1988;148:2019–22.
12. McGowan JE Jr, Gerding DN. Does antibiotic restriction prevent resistance? New Horiz 1996;4: 370–6.
13. Cappelletty D, Jacobs D. Evaluating the impact of a pharmacist’s absence from an antimicrobial stewardship team. Am J Health Syst Pharm 2013;70:1065–69.
14. Shlaes DM, Gerding DN, John JF Jr, et al. Society for Healthcare Epidemiology of America and Infectious Diseases Society of America Joint Committee on the prevention of antimicrobial resistance: guidelines for the prevention of antimicrobial resistance in hospitals. Infect Control Hosp Epidemiol 1997;18:275–91.
15. Dellit TH, Owens RC, McGowan JE, et al. Infectious Diseases Society of America and the Society for Healthcare Epidemiology of America guidelines for developing an institutional program to enhance antimicrobial stewardship. Clin Infect Dis 2007;44:159–77.
16. Policy statement on antimicrobial stewardship by the Society for Healthcare Epidemiology of America (SHEA), the Infectious Diseases Society of America (IDSA), and the Pediatric Infectious Diseases Society (PIDS). Infect Control Hosp Epidemiol 2012;33:322–7.
17. The Joint Commission. Approved: New antimicrobial stewardship standard. Joint Commission Perspectives 2016;36:1–8.
18. Pollack LA, Srinivasan A. Core elements of hospital antibiotic stewardship programs from the Centers for Disease Control and Prevention. Clin Infect Dis 2014;59(Suppl 3):S97–100.
19. Moody J. Infection preventionists have a role in accelerating progress toward preventing the emergence and cross-transmission of MDROs. Prevention Strategist 2012 Summer:52–6.
20. Spellberg B, Bartlett JG, Gilbert DN. The future of antibiotics and resistance. N Engl J Med 2013;368:299–302.
21. Olans RN, Olans RD, Demaria A. The critical role of the staff nurse in antimicrobial stewardship--unrecognized, but already there. Clin Infect Dis 2016;62:84–9.
22. Karanika S, Paudel S, Grigoras C, et al. Systematic review and meta-analysis of clinical and economic outcomes from the implementation of hospital-based antimicrobial stewardship programs. Antimicrob Agents Chemother 2016;60:4840–52.
23. Wagner B, Filice GA, Drekonja D, et al. Antimicrobial stewardship programs in inpatient hospital settings: a systematic review. Infect Control Hosp Epidemiol 2014;35:1209–28.
24. Filice G, Drekonja D, Greer N, et al. Antimicrobial stewardship programs in inpatient settings: a systematic review. VA-ESP Project #09-009; 2013.
25. Cairns KA, Doyle JS, Trevillyan JM, et al. The impact of a multidisciplinary antimicrobial stewardship team on the timeliness of antimicrobial therapy in patients with positive blood cultures: a randomized controlled trial. J Antimicrob Chemother 2016;71:3276–83.
26. Hohn A, Heising B, Hertel S, et al. Antibiotic consumption after implementation of a procalcitonin-guided antimicrobial stewardship programme in surgical patients admitted to an intensive care unit: a retrospective before-and-after analysis. Infection 2015;43:405–12.
27. Singh S, Zhang YZ, Chalkley S, et al. A three-point time series study of antibiotic usage on an intensive care unit, following an antibiotic stewardship programme, after an outbreak of multi-resistant Acinetobacter baumannii. Eur J Clin Microbiol Infect Dis 2015;34:1893–900.
28. Cairns KA, Jenney AW, Abbott IJ, et al. Prescribing trends before and after implementation of an antimicrobial stewardship program. Med J Aust 2013;198:262–6.
29. Liew YX, Lee W, Loh JC, et al. Impact of an antimicrobial stewardship programme on patient safety in Singapore General Hospital. Int J Antimicrob Agents 2012;40:55–60.
30. Bevilacqua S, Demoré B, Boschetti E, et al. 15 years of antibiotic stewardship policy in the Nancy Teaching Hospital. Med Mal Infect 2011;41:532–9.
31. Danaher PJ, Milazzo NA, Kerr KJ, et al. The antibiotic support team--a successful educational approach to antibiotic stewardship. Mil Med 2009;174:201–5.
32. Jenkins TC, Knepper BC, Shihadeh K, et al. Long-term outcomes of an antimicrobial stewardship program implemented in a hospital with low baseline antibiotic use. Infect Control Hosp Epidemiol 2015;36:664–72.
33. Brown KA, Khanafer N, Daneman N, Fisman DN. Meta-analysis of antibiotics and the risk of community-associated Clostridium difficile infection. Antimicrob Agents Chemother 2013;57:2326–32.
34. Deshpande A, Pasupuleti V, Thota P, et al. Community-associated Clostridium difficile infection and antibiotics: a meta-analysis. J Antimicrob Chemother 2013;68:1951–61.
35. Slimings C, Riley TV. Antibiotics and hospital-acquired Clostridium difficile infection: update of systematic review and meta-analysis. J Antimicrob Chemother 2014;69:881–91.
36. Antworth A, Collins CD, Kunapuli A, et al. Impact of an antimicrobial stewardship program comprehensive care bundle on management of candidemia. Pharmacotherapy 2013;33:137–43.
37. Davey P, Brown E, Charani E, et al. Interventions to improve antibiotic prescribing practices for hospital inpatients. Cochrane Database Syst Rev 2013;4:CD003543.
38. Pasquale TR, Trienski TL, Olexia DE, et al. Impact of an antimicrobial stewardship program on patients with acute bacterial skin and skin structure infections. Am J Health Syst Pharm 2014;71:1136–9.
39. Schuts EC, Hulscher ME, Mouton JW, et al. Current evidence on hospital antimicrobial stewardship objectives: a systematic review and meta-analysis. Lancet Infect Dis 2016;16:847–56.
40. Higgins JPT, Green S, editors. Identifying and measuring heterogeneity. Cochrane Handbook for Systematic Reviews of Interventions, version 5.1.0. [Internet]. The Cochrane Collaboration, March 2011. Available at http://handbook.cochrane.org/chapter_9/9_5_2_identifying_and_measuring_heterogeneity.htm.
41. Feazel LM, Malhotra A, Perencevich EN, et al. Effect of antibiotic stewardship programmes on Clostridium difficile incidence: a systematic review and meta-analysis. J Antimicrob Chemother 2014;69:1748–54.
42. Impact of antibiotic stewardship programs on Clostridium difficile (C. diff) infections [Internet]. Centers for Disease Control and Prevention. [Updated 2016 May 13; cited 2016 Oct 11]. Available at www.cdc.gov/getsmart/healthcare/evidence/asp-int-cdiff.html.
43. Burke JP. Antibiotic resistance – squeezing the balloon? JAMA 1998;280:1270–1.
44. This nephrotoxicity result is corrected from the originally published result; communicated by Jan M Prins on behalf of the authors for reference [39]. Prins, JM (Department of Internal Medicine, Division of Infectious Diseases, Academic Medical Centre, Amsterdam, Netherlands). Email communication with Joseph Eckart (Pharmacy Practice & Administration, Ernest Mario School of Pharmacy, Rutgers University, Piscataway, NJ). 2016 Oct 9.
45. Coulter S, Merollini K, Roberts JA, et al. The need for cost-effectiveness analyses of antimicrobial stewardship programmes: a structured review. Int J Antimicrob Agents 2015;46:140–9.
46. Dik J, Vemer P, Friedrich A, et al. Financial evaluations of antibiotic stewardship programs—a systematic review. Frontiers Microbiol 2015;6:317.
47. Campbell KA, Stein S, Looze C, Bosco JA. Antibiotic stewardship in orthopaedic surgery: principles and practice. J Am Acad Orthop Surg 2014;22:772–81.
48. Surveillance for antimicrobial use and antimicrobial resistance options, 2015 [Internet]. Centers for Disease Control and Prevention. [Updated 2016 May 3; cited 2016 Nov 22]. Available at www.cdc.gov/nhsn/acute-care-hospital/aur/index.html.
49. Baysari MT, Lehnbom EC, Li L, Hargreaves A, et al. The effectiveness of information technology to improve antimicrobial prescribing in hospitals: a systematic review and meta-analysis. Int J Med Inform 2016;92:15–34.
50. Bauer KA, West JE, Balada-llasat JM, et al. An antimicrobial stewardship program’s impact with rapid polymerase chain reaction methicillin-resistant Staphylococcus aureus/S. aureus blood culture test in patients with S. aureus bacteremia. Clin Infect Dis 2010;51:1074–80.
51. Sango A, Mccarter YS, Johnson D, et al. Stewardship approach for optimizing antimicrobial therapy through use of a rapid microarray assay on blood cultures positive for Enterococcus species. J Clin Microbiol 2013;51:4008–11.
52. Perez KK, Olsen RJ, Musick WL, et al. Integrating rapid diagnostics and antimicrobial stewardship improves outcomes in patients with antibiotic-resistant Gram-negative bacteremia. J Infect 2014;69:216–25.
53. Bauer KA, Perez KK, Forrest GN, Goff DA. Review of rapid diagnostic tests used by antimicrobial stewardship programs. Clin Infect Dis 2014;59 Suppl 3:S134–145.
54. Dyar OJ, Pagani L, Pulcini C. Strategies and challenges of antimicrobial stewardship in long-term care facilities. Clin Microbiol Infect 2015;21:10–9.
55. D’Agata EM. Antimicrobial use and stewardship programs among dialysis centers. Semin Dial 2013;26:457–64.
56. Smith MJ, Gerber JS, Hersh AL. Inpatient antimicrobial stewardship in pediatrics: a systematic review. J Pediatric Infect Dis Soc 2015;4:e127–135.
57. Fleet E, Gopal Rao G, Patel B, et al. Impact of implementation of a novel antimicrobial stewardship tool on antibiotic use in nursing homes: a prospective cluster randomized control pilot study. J Antimicrob Chemother 2014;69:2265–73.
58. Drekonja DM, Filice GA, Greer N, et al. Antimicrobial stewardship in outpatient settings: a systematic review. Infect Control Hosp Epidemiol 2015;36:142–52.
59. Drekonja D, Filice G, Greer N, et al. Antimicrobial stewardship programs in outpatient settings: a systematic review. VA-ESP Project #09-009; 2014.
60. Zhang YZ, Singh S. Antibiotic stewardship programmes in intensive care units: why, how, and where are they leading us. World J Crit Care Med 2015;4:13–28. (referenced in online Table)
From the Ernest Mario School of Pharmacy, Rutgers, The State University of New Jersey, Piscataway, NJ.
Abstract
- Objective: To review the evidence evaluating inpatient antimicrobial stewardship programs (ASPs) with a focus on clinical and economic outcomes.
- Methods: Pubmed/MEDLINE and the Cochrane Database of Systematic Reviews were used to identify systematic reviews, meta-analyses, randomized controlled trials, and other relevant literature evaluating the clinical and economic impact of ASP interventions.
- Results: A total of 5 meta-analyses, 3 systematic reviews, and 10 clinical studies (2 randomized controlled, 5 observational, and 3 quasi-experimental studies) were identified for analysis. ASPs were associated with a reduction in antimicrobial consumption and use. However, due to the heterogeneity of outcomes measured among studies, the effectiveness of ASPs varied with the measures used. There are data supporting the cost savings associated with ASPs, but these studies are more sparse. Most of the available evidence supporting ASPs is of low quality, and intervention strategies vary widely among available studies.
- Conclusion: Much of the evidence reviewed supports the assertion that ASPs result in a more judicious use of antimicrobials and lead to better patient care in the inpatient setting. While clinical outcomes vary between programs, there are ubiquitous positive benefits associated with ASPs in terms of antimicrobial consumption, C. difficile infection rates, and resistance, with few adverse effects. To date, economic outcomes have been difficult to uniformly quantify, but there are data supporting the economic benefits of ASPs. As the number of ASPs continues to grow, it is imperative that standardized metrics are considered in order to accurately measure the benefits of these essential programs.
Key words: Antimicrobial stewardship; antimicrobial consumption; resistance.
Antimicrobial resistance is a public health concern that has been escalating over the years and is now identified as a global crisis [1–3]. This is partly due to the widespread use of the same antibiotics that have existed for decades, combined with a lack of sufficient novel antibiotic discovery and development [4]. Bacteria that are resistant to our last-line-of-defense medications have recently emerged, and these resistant organisms may spread to treatment-naive patients [5]. Multidrug-resistant organisms are often found, treated, and likely originate within the hospital practice setting, where antimicrobials can be prescribed by any licensed provider [6]. Upwards of 50% of antibiotics administered are unnecessary and contribute to the problem of increasing resistance [7]. The seriousness of this situation is increasingly apparent; in 2014 the World Health Organization (WHO), President Obama, and Prime Minister Cameron issued statements urging solutions to the resistance crisis [8].
While the urgency of the situation is recognized today, efforts aimed at a more judicious use of antibiotics to curb resistance began as early as the 1960s and led to the first antimicrobial stewardship programs (ASPs) [9–11]. ASPs have since been defined as “coordinated interventions designed to improve and measure the appropriate use of antimicrobial agents by promoting the selection of the optimal antimicrobial drug regimen including dosing, duration of therapy, and route of administration” [1]. The primary objectives of these types of programs are to avoid or reduce adverse events (eg, Clostridium difficile infection) and resistance driven by a shift in minimum inhibitory concentrations (MICs) and to reverse the unnecessary economic burden caused by the inappropriate prescribing of these agents [1].
This article examines the evidence evaluating the reported effectiveness of inpatient ASPs, examining both clinical and economic outcomes. In addition, we touch on ASP history, current status, and future directions in light of current trends. While ASPs are expanding into the outpatient and nursing home settings, we will limit our review here to the inpatient setting.
Historical Background
Modern antibiotics date back to the late 1930s when penicillin and sulfonamides were introduced to the medical market, and resistance to these drug classes was reported just a few years after their introduction. The same bacterial resistance mechanisms that neutralized their efficacy then exist today, and these mechanisms continue to confer resistance among those classes [5].
While “stewardship” was not described as such until the late 1990s [12], institutions have historically been proactive in creating standards around antimicrobial utilization to encourage judicious use of these agents. The earliest form of tracking antibiotic use was in the form of paper charts as “antibiotic logs” [9] and “punch cards” [10] in the 1960s. The idea of a team approach to stewardship dates back to the 1970s, with the example of Hartford Hospital in Hartford, Connecticut, which employed an antimicrobial standards model run by an infectious disease (ID) physician and clinical pharmacists [11]. In 1977, the Infectious Diseases Society of America (IDSA) released a statement that clinical pharmacists may have a substantial impact on patient care, including in ID, contributing to the idea that a team of physicians collaborating with pharmacists presents the best way to combat inappropriate medication use. Pharmacist involvement has since been shown to restrict broad overutilized antimicrobial agents and reduce the rate of C. difficile infection by a significant amount [13].
In 1997 the IDSA and the Society for Healthcare Epidemiology of America (SHEA) published guidelines to assist in the prevention of the growing issue of resistance, mentioning the importance of antimicrobial stewardship [14]. A decade later they released joint guidelines for ASP implementation [15], and the Pediatric Infectious Disease Society (PIDS) joined them in 2012 to publish a joint statement acknowledging and endorsing stewardship [16]. In 2014, the Centers of Disease Control and Prevention (CDC) recommended that every hospital should have an ASP. As of 1 January 2017, the Joint Commission requires an ASP as a standard for accreditation at hospitals, critical access hospitals, and nursing care [17]. Guidelines for implementation of an ASP are currently available through the IDSA and SHEA [1,16].
ASP Interventions
There are 2 main strategies that ASPs have to combat inappropriate antimicrobial use, and each has its own set of systematic interventions. These strategies are referred to as “prospective audit with intervention and feedback” and “prior authorization” [6]. Although most ASPs will incorporate these main strategies, each institution typically creates its own strategies and regulations independently.
Prospective audit with intervention and feedback describes the process of providing recommendations after reviewing utilization and trends of antimicrobial use. This is sometimes referred to as the “back-end” intervention, in which decisions are made after antibiotics have been administered. Interventions that are commonly used under this strategy include discontinuation of antibiotics due to culture data, de-escalation to drugs with narrower spectra, IV to oral conversions, and cessation of surgical prophylaxis [6].
Prior authorization, also referred to as a “front-end” intervention, is the process of approving medications before they are used. Interventions include a restricted formulary for antimicrobials that can be managed through a paging system or a built-in computer restriction program, as well as other guidelines and protocols for dosing and duration of therapy. Restrictions typically focus on broad spectrum antibiotics as well as the more costly drugs on formularies. These solutions reduce the need for manual intervention as technology makes it possible to create automated restriction-based services that prevent inappropriate prescribing [6].
Aside from these main techniques, other strategies are taken to achieve the goal of attaining optimal clinical outcomes while limiting further antimicrobial resistance and adverse effects. Different clinical settings have different needs, and ASPs are customized to each setting’s resources, prescribing habits, and other local specificities [1]. These differences present difficulty with interpreting diverse datasets, but certain themes arise in the literature: commonly assessed clinical outcomes of inpatient ASPs include hospital length of stay (LOS) and readmission, reinfection, mortality, and resistance rates. These outcomes are putatively driven by the more prudent use of antimicrobials, particularly by decreased rates of antimicrobial consumption.
ASP Team Members
While ASPs may differ between institutions, the staff members involved are typically the same, and leadership is always an important aspect of a program. The CDC recommends that ASP leadership consist of a program leader (an ID physician) and a pharmacy leader, who co-lead the team [18]. In addition, the Joint Commission recommends that the multidisciplinary team should include an infection preventionist (ie, infection control and hospital epidemiologist) and practitioner [17]; these specialists have a role in prevention, awareness, and policy [19]. The integration of infection control with stewardship yields the best results [15], as infection control aims to prevent antibiotic use altogether, while stewardship increases the quality of antibiotic regimens that are being prescribed [20].
It is also beneficial to incorporate a microbiologist as an integral part of the team, responsible for performing and interpreting laboratory data (ie, cultures). Nurses should be integrated into ASPs due to the overlap of their routine activities with ASP interventions [21]; other clinicians (regardless of their infectious disease clinical background), quality control, information technology, and environmental services should all collaborate in the hospital-wide systems related to the program where appropriate [18].
Evidence Review
Results
Antimicrobial Usage
The most widely studied aspect of ASPs in the current review was the effect of ASP interventions on antimicrobial consumption and use. Three systematic reviews [22–24] showed improved antibiotic prescribing practices and reduced consumption rates overall, as did several studies inside and outside the intensive care unit (ICU) [25–31].One study found an insignificant declining usage trend [32]. An important underlying facet of this observation is that even as total antibiotic consumption decreases, certain antibiotic and antibiotic class consumption may increase. This is evident in several studies, which showed that as aminoglycoside, carbapenem, and β-lactam-β-lactamase inhibitor use increased, clindamycin (1 case), glycopeptide, fluoroquinolone, and macrolide use decreased [27,28,30]. A potential confounding factor relating to decreased glycopeptide use in Bevilacqua et al [30] was that there was an epidemic of glycopeptide-resistant enterococci during the study period, potentially causing prescribers to naturally avoid it. In any case, since the aim of ASPs is to encourage a more judicious usage of antimicrobials, the observed decreases in consumption of those restricted medications is intuitive. These observations about antimicrobial consumption related to ASPs are relevant because they putatively drive improvements in clinical outcomes, especially those related to reduced adverse events associated with these agents, such as the risk of C. difficile infection with certain drugs (eg, fluoroquinolones, clindamycin, and broad-spectrum antibiotics) and prolonged antibiotic usage [33–35]. There is evidence that these benefits are not limited to antibiotics but extend to antifungal agents and possibly antivirals [22,27,36].
Utilization, Mortality, and Infection Rates
ASPs typically intend to improve patient-focused clinical parameters such as hospital LOS, hospital readmissions, mortality, and incidence of infections acquired secondary to antibiotic usage during a hospital stay, especially C. difficile infection. Most of the reviewed evidence indicates no significant LOS benefit due to stewardship interventions [24–26,32,37], and one meta-analysis noted that while overall hospital LOS was significantly reduced, ICU-specific LOS was not [22]. Generally, there was also not a significant change in hospital readmission rates [24,26,32]. However, 2 retrospective observational studies found mixed results for both LOS and readmission rates relative to ASP interventions; while both noted a significantly reduced LOS, one study [38] showed an all-cause readmission benefit in a fairly healthy patient population (but no benefit for readmissions due to the specific infections of interest), and the other [29] showed a benefit for readmissions due to infections but an increased rate of readmissions in the intervention group overall. In this latter study, hospitalizations within the previous 3 months were significantly higher at baseline for the intervention group (55% vs. 46%, P = 0.042), suggesting sicker patients and possibly providing an explanation for this unique observation. Even so, a meta-analysis of 5 studies found a significantly elevated risk of readmission associated with ASP interventions (RR 1.26, 95% CI 1.02–1.57; P = 0.03); the authors noted that non–infection-related readmissions accounted for 61% of readmissions, but this was not significantly different between intervention and non-intervention arms [37].
With regard to mortality, most studies found no significant reductions related to stewardship interventions [22,24,26,29,32]. In a prospective randomized controlled trial, all reported deaths (7/160, 4.4%) were in the ASP intervention arm, but these were attributed to the severity of infection or underlying chronic disease [25]. One meta-analysis, however, found significant mortality reductions related to stewardship guidelines for empirical antibiotic treatment (OR 0.65, 95% CI 0.54–0.80, P < 0.001; I2 = 65%) and to de-escalation of therapy based on culture results (RR 0.44, 95% CI 0.30–0.66, P < 0.001; I2 = 59%), based on 40 and 25 studies, respectively [39], although both results exhibited substantial heterogeneity (defined as I2 = 50%–90% [40]) among the relevant studies. Another meta-analysis found no significant change in mortality related to stewardship interventions intended to improve antibiotic appropriateness (RR 0.92, 95% CI 0.69–1.2, P = 0.56; I2 = 72%) or to reduce excessive prescribing (RR 0.92, 95% CI 0.81–1.06, P = 0.25; I2 = 0%), but a significant mortality benefit associated with interventions aimed at increasing guideline compliance for pneumonia diagnoses (RR 0.89, 95% CI 0.82–0.97, P = 0.005; I2 = 0%) [37]. In the case of Schuts et al [39], the search criteria specifically sought studies that assessed clinical outcomes (eg, mortality), whereas the search of Davey et al [37] focused on studies whose aim was to improve antibiotic prescribing, with a main comparison between restrictive and persuasive interventions; while the difference may seem subtle, the bodies of data compiled from these searches may characterize the ASP effect on mortality differently. No significant evidence was found to suggest that reduced antimicrobial consumption increases mortality.
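For reference, the I2 statistic quoted throughout these meta-analyses is conventionally derived from Cochran's Q; a standard formulation (not specific to any cited study) is:

```latex
I^2 = \max\!\left(0,\ \frac{Q - (k - 1)}{Q}\right) \times 100\%,
\qquad
Q = \sum_{i=1}^{k} w_i \,(\hat{\theta}_i - \hat{\theta})^2
```

where k is the number of pooled studies, Q is computed from each study's effect estimate and inverse-variance weight relative to the pooled estimate, and the 50%–90% band cited above [40] corresponds to the conventional range for substantial heterogeneity.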
Improving the use of antimicrobial agents should limit collateral damage associated with their use (eg, damage to normal flora and increased resistance), and ideally infections should be better managed. As previously mentioned, one of the concerns with antibiotic usage (particularly fluoroquinolones, macrolides, and broad-spectrum agents) is that collateral damage could lead to increased rates of C. difficile infection. One meta-analysis showed no significant reduction in the rate of C. difficile infection (as well as overall infection rate) relative to ASPs [22]; however, this finding was based on only 3 of the 26 studies analyzed, and only 1 of those 3 studies utilized restrictions for fluoroquinolones and cephalosporins. An interrupted time series (ITS) study similarly found no significant reduction in C. difficile infection rate [32]; however, this study was conducted in a hospital with low baseline antibiotic prescribing (it was ranked second-to-last in terms of antibiotic usage among its peer institutions), inherently limiting the risk of C. difficile infection among patients in the pre-ASP setting. In contrast to these findings, a meta-analysis specifically designed to assess the incidence of C. difficile infection relative to stewardship programs found a significantly reduced risk of infection based on 16 studies (RR 0.48, 95% CI 0.38–0.62, P < 0.001; I2 = 76%) [41], and the systematic review conducted by Filice et al [24] found a significant benefit with regard to the C. difficile infection rate in 4 of 6 studies. These results are consistent with those presented as evidence for the impact of stewardship on C. difficile infection by the CDC [42]. Aside from C. difficile infection, one retrospective observational study found that the 14-day reinfection rate (ie, reinfection with the same infection at the same anatomical location) was significantly reduced following stewardship intervention (0% vs. 10%, P = 0.009) [29]. This finding, combined with the C. difficile infection examples, provides evidence of better infection management by ASPs.
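As a point of reference for the risk ratios reported above, an RR and its approximate 95% confidence interval can be derived from a simple 2×2 table using the standard log-normal approximation. The sketch below uses entirely invented counts, not data from any cited study; the counts merely happen to yield a point estimate in the range of those quoted above.

```python
import math

def risk_ratio_ci(a, b, c, d, z=1.96):
    """Risk ratio and approximate 95% CI from a 2x2 table.

    a: events in intervention arm,  b: non-events in intervention arm
    c: events in control arm,       d: non-events in control arm
    Uses the usual log-normal approximation for the standard error of ln(RR).
    """
    rr = (a / (a + b)) / (c / (c + d))
    se_log_rr = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return rr, lower, upper

# Hypothetical counts: 24/400 infections with an ASP vs. 50/400 without
print(risk_ratio_ci(24, 376, 50, 350))  # -> approximately (0.48, 0.30, 0.77)
```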
While the general trend seems to suggest mixed or no significant benefit for several clinical outcomes, it is important to note that variation in outcomes could be due to differences in the types of ASP interventions and intervention study periods across differing programs. Davey et al [37] found variation in prescribing outcomes based on whether restrictive (ie, restricting prescriber freedom with antimicrobials) or persuasive (ie, suggesting changes to the prescriber) interventions were used, and on the timeframe in which they were used. At one month into an ASP, restrictive interventions resulted in better prescribing practices relative to persuasive interventions based on 27 studies (effect size 32.0%, 95% CI 2.5%–61.4%), but by 6 months the 2 were not statistically different (effect size 10.1%, 95% CI –47.5% to 66.0%). At 12 and 24 months, persuasive interventions demonstrated greater effects on prescribing outcomes, but these were not significant. These findings provide evidence that different study timeframes can capture ASP effects differently, and study timeframes already vary widely in the literature. Considering the variety of ASP interventions employed across the different studies, these factors almost certainly affect the reported antimicrobial consumption rates and outcomes to different degrees. A high degree of heterogeneity within an analyzed dataset could itself be the reason for net non-significance within single systematic reviews and meta-analyses.
Resistance
Another goal of ASPs is the prevention of antimicrobial resistance, an area where the evidence generally suggests benefit associated with ASP interventions. Resistance rates among common troublesome organisms, such as methicillin-resistant S. aureus (MRSA), imipenem-resistant P. aeruginosa, and extended-spectrum β-lactamase (ESBL)–producing Klebsiella spp, were significantly reduced in a meta-analysis; rates for ESBL-producing E. coli were not, however [22]. An ITS study found significantly reduced MRSA resistance, as well as reduced P. aeruginosa resistance to imipenem-cilastatin and levofloxacin (all P < 0.001), but no significant changes with respect to piperacillin/tazobactam, cefepime, or amikacin resistance [32]. This study also noted increased E. coli resistance to levofloxacin and ceftriaxone (both P < 0.001). No significant changes in resistance were noted for vancomycin-resistant enterococci. It is a reasonable expectation that decreasing inappropriate antimicrobial use may decrease long-term antimicrobial resistance, but because most studies span only a few years, only the minute changes in resistance occurring over those periods are currently understood [23]. Longer duration studies are needed to better understand resistance outcomes.
Of note is a phenomenon known as the “squeezing the balloon” effect, which can be associated with ASPs and potentially results in paradoxically increased resistance [43]. That is, when usage restrictions are placed on certain antibiotics, the use of other, non-restricted antibiotics may increase, possibly leading to increased resistance to those non-restricted antibiotics [22] (“constraining one end [of a balloon] causes the other end to bulge … limiting the use of one class of compounds may be counteracted by corresponding changes in prescribing and drug resistance that are even more ominous” [43]). Karanika et al [22] took this phenomenon into consideration and assessed restricted and non-restricted antimicrobial consumption separately. They found a reduction in consumption for both restricted and non-restricted antibiotics, including “high potential resistance” antibiotics, specifically carbapenems and glycopeptides. In the study conducted by Cairns et al [28], a similar effect was observed; while the use of other classes of antibiotics decreased (eg, cephalosporins and aminoglycosides), the use of β-lactam-β-lactamase inhibitor combinations actually increased by 48% (change in use: +48.2% [95% CI 21.8%–47.9%]). Hohn et al [26] noted an increased usage rate of carbapenems, even though several other classes of antibiotics had reduced usage. Unfortunately, neither study reported resistance rates, so the impact of these findings is unknown. Finally, Jenkins et al [32] assessed trends in antimicrobial use as changes in rates of consumption. Among the various antibiotics assessed in this study, the rate of fluoroquinolone use decreased both before and after the intervention period, although the decline slowed post-ASP (the change in rate post-ASP was +2.2% [95% CI 1.4%–3.1%], P < 0.001). They observed a small (but significant) increase in resistance of E. coli to levofloxacin pre- vs. post-intervention (11.0% vs. 13.9%, P < 0.001); in contrast, a significant decrease in resistance of P. aeruginosa was observed (30.5% vs. 21.4%, P < 0.001). While these examples help illustrate the concept of changes in antibiotic usage patterns associated with an ASP, at best they approximate the “squeezing the balloon” effect, since these studies present data for antibiotics that were either restricted or for which restriction status was not clearly specified. The “squeezing the balloon” effect is most relevant for the unintended, potentially increased usage of non-restricted drugs secondary to ASP restrictions. Higher resistance rates among certain drug classes observed in the context of this effect would constitute a drawback of an ASP.
Adverse Effects
Reduced toxicities and adverse effects are expected with reduced usage of antimicrobials. The systematic review conducted by Filice et al [24] examined the incidence of adverse effects related to antibiotic usage, and their findings suggest, at the least, that stewardship programs generally do not cause harm, as only 2 of the studies they examined reported adverse events. Following stewardship interventions, 5.5% of patients deteriorated, and of those, the large majority (75%) did so due to progression of oncological malignancies. To further illustrate the effect of stewardship interventions on the toxicities and side effects of antimicrobials, Schuts et al demonstrated that the risk of nephrotoxicity while on antimicrobial therapy was reduced as a result of an ASP, based on 14 studies with moderate heterogeneity (OR 0.46, 95% CI 0.28–0.77, P = 0.003; I2 = 34%) [39,44]. It is intuitive that reduced drug exposure results in fewer adverse effects; as such, these results are expected.
Economic Outcomes
Although the focus of ASPs is often to improve clinical outcomes, economic outcomes are an important component of these programs, and the economic value they bring should be highlighted and further detailed [22,45,46]. Since clinical outcomes are often the main objective of ASPs, most available studies have been clinical effect studies (rather than economic analyses), in which economic assessments are often a secondary consideration, if included at all.
As a result, cost evaluations typically address direct cost reductions, whereas indirect cost reductions are often not critically evaluated. Where ASPs are effective at decreasing antimicrobial consumption, they reduce hospital expenditures by limiting hospital-acquired infections and their associated medical costs [22,45], and by reducing antibiotic misuse, iatrogenic infections, and the rates of antibiotic-resistant organisms [47]. In one retrospective observational study, annual costs of antibiotics dropped by 33% with re-implementation of an ASP, mirrored by an overall decrease in antibiotic consumption of about 10% over the course of the intervention study period [30]. Of note, at 1 year post-ASP re-implementation, antibiotic consumption actually increased (by 5.4%); however, because antibiotic usage had shifted to more appropriate and cost-effective therapies, expenditures associated with antibiotics were still reduced by 13% for that year relative to pre-ASP re-implementation. Aside from economic evaluations centered on consumption rates, there is the potential to further evaluate economic benefits associated with stewardship when looking at other outcomes, including hospital LOS [22], as well as indirect costs such as those related to morbidity and mortality, and societal and operational costs [46]. Currently, these detailed analyses are lacking. In conjunction with more standardized clinical metrics, these assessments are needed to better delineate the full cost-effectiveness of ASPs.
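To make the direct/indirect distinction concrete, the brief sketch below tallies only the direct drug-acquisition component of an ASP cost evaluation. All figures are invented for illustration and are not drawn from the studies cited above; the indirect components are deliberately left unquantified, mirroring their treatment in most published evaluations.

```python
# Hypothetical direct-cost evaluation of the kind most ASP studies report.
# All figures are invented; indirect costs are intentionally not estimated.

annual_antibiotic_spend = {
    "pre_asp": 1_250_000.0,   # USD, year before (re-)implementation
    "post_asp": 1_030_000.0,  # USD, first year after (re-)implementation
}

def percent_change(before: float, after: float) -> float:
    """Relative change, expressed as a percentage of the baseline value."""
    return 100.0 * (after - before) / before

direct_savings = annual_antibiotic_spend["pre_asp"] - annual_antibiotic_spend["post_asp"]
change = percent_change(annual_antibiotic_spend["pre_asp"], annual_antibiotic_spend["post_asp"])
print(f"Direct drug-cost change: {change:.1f}%")      # -> -17.6%
print(f"Direct savings: ${direct_savings:,.0f}")      # -> $220,000
# Indirect components (reduced LOS, fewer resistant infections, societal and
# operational costs) would require separate, and currently uncommon, analyses.
```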
Evidence Summary
The evidence for inpatient ASP effectiveness is promising but mixed. Much of the evidence is low-level, based on observational studies that are retrospective in nature, and the available systematic reviews and meta-analyses draw on these same types of studies. Studies have been conducted over a range of years, and the durations of intervention periods vary widely between studies; it is difficult to capture and account for all of the infection, prescribing, and drug availability patterns (as well as intervention differences and new drug approvals) throughout these time periods. To complicate matters, both the quality of the data and the quality of the ASPs themselves are highly variable.
As such, the findings across pooled ASP studies are difficult to amalgamate and draw concrete conclusions from, owing to the inherent heterogeneity involved in comparing smaller individual studies in systematic reviews and meta-analyses. Currently, there are numerous ways to implement an ASP, but there is no standardized system of specific interventions or metrics. Until similar ASPs and interventions can be compared directly among various institutions, it will be challenging to generalize positive benefits from systematic reviews and meta-analyses. The CDC is currently involved in a new initiative in which data from various hospitals are compiled to create a surveillance database [48]. Although this is a step in the right direction for standardized stewardship metrics, for the current review the lack of standard metrics leads to conflicting results across heterogeneous studies, making it difficult to show clear benefits in clinical outcomes.
Despite the vast array of ASPs, their differences, and a range of clinical measures, many with conflicting evidence, there is a noticeable trend toward more prudent use of antimicrobials. Based on the review of available evidence, inpatient ASPs improve patient care and preserve an important health care resource: antibiotics. As presented, this is demonstrated by the altered consumption of these agents, which has ramifications for secondary outcomes such as reduced rates of C. difficile infection, resistance, and adverse effects, and overall translates into better patient care and reduced costs. But while we can conclude that the direct stewardship interventions of reducing and restricting antibiotic use have been effective, we cannot clearly state the overall magnitude of benefit, the effectiveness of various ASP structures and components on clinical outcomes (such as LOS and mortality), or the cost savings, owing to the heterogeneity of the available evidence.
Future Directions
Moving forward, the future of ASPs encompasses several potential developments. First and foremost, as technology continues to advance, there is a need to integrate and utilize developments in information technology (IT). Baysari et al conducted a review on the value of IT interventions, focusing mainly on decision support (stand-alone or as a component of other hospital systems), approval, and surveillance systems [49]. There was benefit associated with these IT interventions in terms of improvement in the appropriate use of antimicrobials (RR 1.49, 95% CI 1.07–2.08, P < 0.05; I2 = 93%), but there was no demonstrated benefit in terms of patient mortality or hospital LOS. Aside from this study, broad evidence is still lacking to support the use of IT systems in ASPs because meaningful comparisons among the interventions have not been made, owing to widespread variability in study design and outcome measures. However, it is generally agreed that ASPs must integrate with IT systems as the use of technology within the healthcare field continues to grow. Evidence is needed in the form of higher-quality studies centered on similar outcomes to show appropriate approaches for ASPs to leverage IT systems. At a minimum, the integration of IT into ASPs should not hinder clinical outcomes. An important consideration is the variation in practice settings where antibiotic stewardship is to be implemented; eg, a small community hospital will be less equipped to incorporate and support technological tools than a large tertiary teaching hospital. Therefore, any antibiotic stewardship IT intervention must be customized to meet local needs, account for prescriber behaviors, minimize barriers to implementation, and utilize available resources.
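As a purely illustrative sketch of the kind of rule-based decision support such systems can provide, the example below flags restricted-agent orders that lack an approval code and prolonged empiric therapy without finalized cultures. The drug list, field names, and thresholds are hypothetical and are not those of any system evaluated in the cited review; a real implementation would reflect local formulary policy.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical stewardship checks an ordering or surveillance system might run;
# agents, fields, and thresholds are illustrative assumptions only.

RESTRICTED_AGENTS = {"meropenem", "daptomycin", "linezolid"}  # assumed local restricted list
MAX_EMPIRIC_DAYS = 3  # assumed local rule: re-review empiric therapy after 72 hours

@dataclass
class AntimicrobialOrder:
    drug: str
    days_of_therapy: int
    id_approval_code: Optional[str] = None
    cultures_finalized: bool = False

def stewardship_alerts(order: AntimicrobialOrder) -> List[str]:
    """Return human-readable alerts for an order that violates the assumed rules."""
    alerts = []
    if order.drug.lower() in RESTRICTED_AGENTS and not order.id_approval_code:
        alerts.append(f"{order.drug}: restricted agent ordered without ID/ASP approval")
    if order.days_of_therapy > MAX_EMPIRIC_DAYS and not order.cultures_finalized:
        alerts.append(
            f"{order.drug}: empiric therapy beyond {MAX_EMPIRIC_DAYS} days without finalized cultures"
        )
    return alerts

# Example: a restricted carbapenem on day 4 of empiric therapy triggers both alerts
print(stewardship_alerts(AntimicrobialOrder("meropenem", days_of_therapy=4)))
```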
Another area of focus for future ASPs is the use of rapid diagnostics. Currently, when patients present with signs and symptoms of an infection, an empiric antimicrobial regimen is started and then de-escalated as necessary; rapid testing can help initiate appropriate therapy more quickly and increase antimicrobial effectiveness. Rapid tests include rapid polymerase chain reaction (PCR)-based screening [50], Verigene gram-positive blood culture (BC-GP) testing [51], next-generation sequencing methods, and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) [52]. Rapid diagnostic tools should be viewed as aids that help ASPs decrease antibiotic consumption and improve patient outcomes; these tools have been shown to improve clinical outcomes when integrated into ASPs, but they offer little value toward the goals of ASPs when used outside of stewardship programs and their sensitive timeframes [53].
In terms of future ASP expansion, stewardship implementation can become more unified and broader in scope. ASPs should expand to include antifungal interventions, an area that is showing progress [36]. ASPs can also be implemented in new areas throughout the hospital (eg, pediatrics and the emergency department), as well as in areas outside of the hospital setting, including long-term care facilities, dialysis centers, and other institutions [54–56]. A prospective randomized controlled study conducted in 30 nursing homes evaluated the use of a novel resident antimicrobial management plan (RAMP) for improved use of antimicrobials [57]. This study found that the RAMP had no associated adverse effects, suggesting that stewardship is an important tool in nursing homes. In addition, the general outpatient and pediatric settings show promise for ASPs [56,58,59], but more research is needed to support expansion and to identify how ASP interventions should be applied in these various practice settings. The stewardship interventions employed will need to be carefully delineated to account for the scale, underlying need, and potential challenges of those settings.
While the future of antibiotic stewardship is unclear, it will certainly continue to develop in both scope and depth, encompassing new areas of focus, improving outcomes in new settings, and employing new tools to refine approaches. An important first step for the continued development of ASPs is alignment and standardization, since without alignment it will remain difficult to compare outcomes. This issue is currently being addressed by a number of organizations. With current support from the Joint Commission, the CDC, and the President's Council of Advisors on Science and Technology (PCAST) [8], regulatory requirements for ASPs are well underway, and these drivers will appropriately position ASPs for further advancement. By reducing variability among ASPs and clearly delineating their implementation, the economic and clinical benefits associated with specific interventions can be identified.
Corresponding author: Luigi Brunetti, PharmD, MPH, Rutgers, The State University of New Jersey, 160 Frelinghuysen Rd., Piscataway, NJ 08854, [email protected].
Financial disclosures: None.
1. Barlam TF, Cosgrove SE, Abbo LM, et al. Implementing an antibiotic stewardship program: guidelines by the Infectious Diseases Society of America and the Society for Healthcare Epidemiology of America. Clin Infect Dis 2016;62:e51–77.
2. Hughes D. Selection and evolution of resistance to antimicrobial drugs. IUBMB Life 2014;66:521–9.
3. World Health Organization. The evolving threat of antimicrobial resistance – options for action. Geneva: WHO Press; 2012.
4. Gould IM, Bal AM. New antibiotic agents in the pipeline and how they can help overcome microbial resistance. Virulence 2013;4:185–91.
5. Davies J, Davies D. Origins and evolution of antibiotic resistance. Microbiol Mol Biol Rev 2010;74:417–33.
6. Owens RC Jr. Antimicrobial stewardship: concepts and strategies in the 21st century. Diagn Microbiol Infect Dis 2008;61:110–28.
7. Antibiotic resistance threats in the United States, 2013 [Internet]. Centers for Disease Control and Prevention. Available at www.cdc.gov/drugresistance/pdf/ar-threats-2013-508.pdf.
8. Nathan C, Cars O. Antibiotic resistance – problems, progress, prospects. N Engl J Med 2014;371:1761–3.
9. McGoldrick M. Antimicrobial stewardship. Home Healthc Nurse 2014;32:559–60.
10. Ruedy J. A method of determining patterns of use of antibacterial drugs. Can Med Assoc J 1966;95:807–12.
11. Briceland LL, Nightingale CH, Quintiliani R, et al. Antibiotic streamlining from combination therapy to monotherapy utilizing an interdisciplinary approach. Arch Intern Med 1988;148:2019–22.
12. McGowan JE Jr, Gerding DN. Does antibiotic restriction prevent resistance? New Horiz 1996;4: 370–6.
13. Cappelletty D, Jacobs D. Evaluating the impact of a pharmacist’s absence from an antimicrobial stewardship team. Am J Health Syst Pharm 2013;70:1065–69.
14. Shlaes DM, Gerding DN, John JF Jr, et al. Society for Healthcare Epidemiology of America and Infectious Diseases Society of America Joint Committee on the prevention of antimicrobial resistance: guidelines for the prevention of antimicrobial resistance in hospitals. Infect Control Hosp Epidemiol 1997;18:275–91.
15. Dellit TH, Owens RC, McGowan JE, et al. Infectious Diseases Society of America and the Society for Healthcare Epidemiology of America guidelines for developing an institutional program to enhance antimicrobial stewardship. Clin Infect Dis 2007;44:159–77.
16. Policy statement on antimicrobial stewardship by the Society for Healthcare Epidemiology of America (SHEA), the Infectious Diseases Society of America (IDSA), and the Pediatric Infectious Diseases Society (PIDS). Infect Control Hosp Epidemiol 2012;33:322–7.
17. The Joint Commission. Approved: New antimicrobial stewardship standard. Joint Commission Perspectives 2016;36:1–8.
18. Pollack LA, Srinivasan A. Core elements of hospital antibiotic stewardship programs from the Centers for Disease Control and Prevention. Clin Infect Dis 2014;59(Suppl 3):S97–100.
19. Moody J. Infection preventionists have a role in accelerating progress toward preventing the emergence and cross-transmission of MDROs. Prevention Strategist 2012 Summer:52–6.
20. Spellberg B, Bartlett JG, Gilbert DN. The future of antibiotics and resistance. N Engl J Med 2013;368:299–302.
21. Olans RN, Olans RD, Demaria A. The critical role of the staff nurse in antimicrobial stewardship--unrecognized, but already there. Clin Infect Dis 2016;62:84–9.
22. Karanika S, Paudel S, Grigoras C, et al. Systematic review and meta-analysis of clinical and economic outcomes from the implementation of hospital-based antimicrobial stewardship programs. Antimicrob Agents Chemother 2016;60:4840–52.
23. Wagner B, Filice GA, Drekonja D, et al. Antimicrobial stewardship programs in inpatient hospital settings: a systematic review. Infect Control Hosp Epidemiol 2014;35:1209–28.
24. Filice G, Drekonja D, Greer N, et al. Antimicrobial stewardship programs in inpatient settings: a systematic review. VA-ESP Project #09-009; 2013.
25. Cairns KA, Doyle JS, Trevillyan JM, et al. The impact of a multidisciplinary antimicrobial stewardship team on the timeliness of antimicrobial therapy in patients with positive blood cultures: a randomized controlled trial. J Antimicrob Chemother 2016;71:3276–83.
26. Hohn A, Heising B, Hertel S, et al. Antibiotic consumption after implementation of a procalcitonin-guided antimicrobial stewardship programme in surgical patients admitted to an intensive care unit: a retrospective before-and-after analysis. Infection 2015;43:405–12.
27. Singh S, Zhang YZ, Chalkley S, et al. A three-point time series study of antibiotic usage on an intensive care unit, following an antibiotic stewardship programme, after an outbreak of multi-resistant Acinetobacter baumannii. Eur J Clin Microbiol Infect Dis 2015;34:1893–900.
28. Cairns KA, Jenney AW, Abbott IJ, et al. Prescribing trends before and after implementation of an antimicrobial stewardship program. Med J Aust 2013;198:262–6.
29. Liew YX, Lee W, Loh JC, et al. Impact of an antimicrobial stewardship programme on patient safety in Singapore General Hospital. Int J Antimicrob Agents 2012;40:55–60.
30. Bevilacqua S, Demoré B, Boschetti E, et al. 15 years of antibiotic stewardship policy in the Nancy Teaching Hospital. Med Mal Infect 2011;41:532–9.
31. Danaher PJ, Milazzo NA, Kerr KJ, et al. The antibiotic support team--a successful educational approach to antibiotic stewardship. Mil Med 2009;174:201–5.
32. Jenkins TC, Knepper BC, Shihadeh K, et al. Long-term outcomes of an antimicrobial stewardship program implemented in a hospital with low baseline antibiotic use. Infect Control Hosp Epidemiol 2015;36:664–72.
33. Brown KA, Khanafer N, Daneman N, Fisman DN. Meta-analysis of antibiotics and the risk of community-associated Clostridium difficile infection. Antimicrob Agents Chemother 2013;57:2326–32.
34. Deshpande A, Pasupuleti V, Thota P, et al. Community-associated Clostridium difficile infection and antibiotics: a meta-analysis. J Antimicrob Chemother 2013;68:1951–61.
35. Slimings C, Riley TV. Antibiotics and hospital-acquired Clostridium difficile infection: update of systematic review and meta-analysis. J Antimicrob Chemother 2014;69:881–91.
36. Antworth A, Collins CD, Kunapuli A, et al. Impact of an antimicrobial stewardship program comprehensive care bundle on management of candidemia. Pharmacotherapy 2013;33:137–43.
37. Davey P, Brown E, Charani E, et al. Interventions to improve antibiotic prescribing practices for hospital inpatients. Cochrane Database Syst Rev 2013;4:CD003543.
38. Pasquale TR, Trienski TL, Olexia DE, et al. Impact of an antimicrobial stewardship program on patients with acute bacterial skin and skin structure infections. Am J Health Syst Pharm 2014;71:1136–9.
39. Schuts EC, Hulscher ME, Mouton JW, et al. Current evidence on hospital antimicrobial stewardship objectives: a systematic review and meta-analysis. Lancet Infect Dis 2016;16:847–56.
40. Higgins JPT, Green S, editors. Identifying and measuring heterogeneity. Cochrane Handbook for Systematic Reviews of Interventions, version 5.1.0. [Internet]. The Cochrane Collaboration, March 2011. Available at http://handbook.cochrane.org/chapter_9/9_5_2_identifying_and_measuring_heterogeneity.htm.
41. Feazel LM, Malhotra A, Perencevich EN, et al. Effect of antibiotic stewardship programmes on Clostridium difficile incidence: a systematic review and meta-analysis. J Antimicrob Chemother 2014;69:1748–54.
42. Impact of antibiotic stewardship programs on Clostridium difficile (C. diff) infections [Internet]. Centers for Disease Control and Prevention. [Updated 2016 May 13; cited 2016 Oct 11]. Available at www.cdc.gov/getsmart/healthcare/evidence/asp-int-cdiff.html.
43. Burke JP. Antibiotic resistance – squeezing the balloon? JAMA 1998;280:1270–1.
44. This nephrotoxicity result is corrected from the originally published result; communicated by Jan M Prins on behalf of the authors for reference [39]. Prins, JM (Department of Internal Medicine, Division of Infectious Diseases, Academic Medical Centre, Amsterdam, Netherlands). Email communication with Joseph Eckart (Pharmacy Practice & Administration, Ernest Mario School of Pharmacy, Rutgers University, Piscataway, NJ). 2016 Oct 9.
45. Coulter S, Merollini K, Roberts JA, et al. The need for cost-effectiveness analyses of antimicrobial stewardship programmes: a structured review. Int J Antimicrob Agents 2015;46:140–9.
46. Dik J, Vemer P, Friedrich A, et al. Financial evaluations of antibiotic stewardship programs—a systematic review. Frontiers Microbiol 2015;6:317.
47. Campbell KA, Stein S, Looze C, Bosco JA. Antibiotic stewardship in orthopaedic surgery: principles and practice. J Am Acad Orthop Surg 2014;22:772–81.
48. Surveillance for antimicrobial use and antimicrobial resistance options, 2015 [Internet]. Centers for Disease Control and Prevention. [Updated 2016 May 3; cited 2016 Nov 22]. Available at www.cdc.gov/nhsn/acute-care-hospital/aur/index.html.
49. Baysari MT, Lehnbom EC, Li L, Hargreaves A, et al. The effectiveness of information technology to improve antimicrobial prescribing in hospitals: a systematic review and meta-analysis. Int J Med Inform. 2016;92:15-34.
50. Bauer KA, West JE, Balada-Llasat JM, et al. An antimicrobial stewardship program’s impact with rapid polymerase chain reaction methicillin-resistant Staphylococcus aureus/S. aureus blood culture test in patients with S. aureus bacteremia. Clin Infect Dis 2010;51:1074–80.
51. Sango A, McCarter YS, Johnson D, et al. Stewardship approach for optimizing antimicrobial therapy through use of a rapid microarray assay on blood cultures positive for Enterococcus species. J Clin Microbiol 2013;51:4008–11.
52. Perez KK, Olsen RJ, Musick WL, et al. Integrating rapid diagnostics and antimicrobial stewardship improves outcomes in patients with antibiotic-resistant Gram-negative bacteremia. J Infect 2014;69:216–25.
53. Bauer KA, Perez KK, Forrest GN, Goff DA. Review of rapid diagnostic tests used by antimicrobial stewardship programs. Clin Infect Dis 2014;59 Suppl 3:S134–145.
54. Dyar OJ, Pagani L, Pulcini C. Strategies and challenges of antimicrobial stewardship in long-term care facilities. Clin Microbiol Infect 2015;21:10–9.
55. D’Agata EM. Antimicrobial use and stewardship programs among dialysis centers. Semin Dial 2013;26:457–64.
56. Smith MJ, Gerber JS, Hersh AL. Inpatient antimicrobial stewardship in pediatrics: a systematic review. J Pediatric Infect Dis Soc 2015;4:e127–135.
57. Fleet E, Gopal Rao G, Patel B, et al. Impact of implementation of a novel antimicrobial stewardship tool on antibiotic use in nursing homes: a prospective cluster randomized control pilot study. J Antimicrob Chemother 2014;69:2265–73.
58. Drekonja DM, Filice GA, Greer N, et al. Antimicrobial stewardship in outpatient settings: a systematic review. Infect Control Hosp Epidemiol 2015;36:142–52.
59. Drekonja D, Filice G, Greer N, et al. Antimicrobial stewardship programs in outpatient settings: a systematic review. VA-ESP Project #09-009; 2014.
60. Zhang YZ, Singh S. Antibiotic stewardship programmes in intensive care units: why, how, and where are they leading us. World J Crit Care Med 2015;4:13–28. (referenced in online Table)
Selecting a Direct Oral Anticoagulant for the Geriatric Patient with Nonvalvular Atrial Fibrillation
From the Ernest Mario School of Pharmacy, Rutgers University, Piscataway, NJ.
Abstract
- Objective: To provide a clinical summary of the available data evaluating the use of direct oral anticoagulants (DOACs) in geriatric patients with nonvalvular atrial fibrillation.
- Methods: MEDLINE, Web of Science, and Google Scholar were used to identify pertinent systematic reviews, randomized controlled trials, observational studies, and pharmacokinetic studies evaluating use of DOACs in the geriatric population.
- Results: A total of 8 systematic reviews, 5 randomized controlled trials, 2 observational studies, and 5 pharmacokinetic studies of relevance were identified for inclusion in this review. The landscape of anticoagulation has changed dramatically over the past 5 years, beginning with the development and marketing of an oral direct thrombin inhibitor, followed by 3 oral direct factor Xa inhibitors. Despite significant advances in oral anticoagulation, many questions remain as to the best therapeutic approach in the geriatric population because the literature is lacking. This population has a higher risk of stroke; however, because of the increased risk of bleeding, clinicians may defer anticoagulant therapy for fear of hemorrhagic complications. Clinicians must consider the risk-benefit ratio and the associated outcomes in geriatric patients compared with other patient populations.
- Conclusions: Interpreting the available literature and understanding the benefits and limitations of the DOACs is critical when selecting the most appropriate pharmacologic strategy in geriatric patients.
Anticoagulants are among the top 5 drug classes associated with patient harm in the US [1] and are commonly reported as contributing to hospitalizations [2]. In just one quarter in 2012 alone, warfarin, dabigatran, and rivaroxaban accounted for 1734 of 50,289 adverse events reported to the Food and Drug Administration (FDA), including 233 deaths [3]. Appropriate use of anticoagulant agents and consideration of individual patient risk factors are essential to mitigate the occurrence of adverse consequences, especially in the geriatric population. This population is more likely to have risk factors for adverse drug events, for example, polypharmacy, age-related changes in pharmacokinetics and pharmacodynamics, and diminished organ function (ie, renal and hepatic) [4,5]. Another important consideration is the lack of consensus on the definition of a “geriatric” or “elderly” patient. Although many consider a chronological age of > 65 years as the defining variable for a geriatric individual, this definition does not account for overall health status [6,7]. Clinicians should consider this shortcoming when evaluating the quality of geriatric studies. For example, a study claiming to evaluate the pharmacokinetics of a drug in a geriatric population enrolling healthy subjects aged > 65 years may result in data that do not translate to clinical practice.
Compounding the concern for iatrogenic events is the frequency of anticoagulant use in the geriatric population, as several indications are found more commonly in this age-group. Stroke prevention in nonvalvular atrial fibrillation (AF), the most common arrhythmia in the elderly, is a common indication for long-term anticoagulation [8]. The prevalence of AF increases with age and is usually higher in men than in women [9,10]. AF is generally uncommon before 60 years of age, but the prevalence increases noticeably thereafter, affecting approximately 10% of the overall population by 80 years of age [11]. The median age of patients who have AF is 75 years with approximately 70% of patients between 65 and 85 years of age [8,12]. Currently in the United States, an estimated 2.3 million people are diagnosed with AF [8]. In 2020, the AF population is predicted to increase to 7.5 million individuals with an expected prevalence of 13.5% among individuals ≥ 75 years of age, and 18.2% for those ≥ 85 years of age [13]. These data underscore the importance of considering the influence of age on the balance between efficacy and safety of anticoagulant therapy.
Direct oral anticoagulants (DOACs) represent the first alternatives to warfarin in over 6 decades. Currently available products in the US include apixaban, dabigatran, edoxaban, and rivaroxaban. DOACs possess many of the characteristics of an ideal anticoagulant, including predictable pharmacokinetics, a wider therapeutic window than warfarin, minimal drug interactions, a fixed dose, and no need for routine evaluation of coagulation parameters. The safety and efficacy of the DOACs for stroke prevention in nonvalvular AF have been substantiated in several landmark clinical trials [14–16]. Yet several important questions remain to be addressed, such as management of excessive anticoagulation, clinical outcome data with renally adjusted doses (an exclusion criterion in many landmark studies was a creatinine clearance of < 25–30 mL/min), whether monitoring of coagulation parameters could enhance efficacy and safety, and optimal dosing strategies in geriatric patients. This review provides clinicians with a summary of data from landmark studies, post-marketing surveillance, and pharmacokinetic evaluations to support DOAC selection in the geriatric population.
Evaluating Bleeding Risk
Bleeding risk–prediction tools such as HEMORR2HAGES, ATRIA, and HAS-BLED [17–19] have been extensively evaluated with warfarin therapy, but their performance in predicting DOAC-related bleeding has not been definitively established. Nonetheless, until tools validated specifically for DOACs are developed, it is reasonable to use these scores for risk prediction in combination with clinical judgment. As an example, the European Society of Cardiology guideline on the use of non–vitamin K antagonist (VKA) anticoagulants in patients with nonvalvular AF suggests that the HAS-BLED score may be used to identify risk factors for bleeding and to correct those that are modifiable [20]. The HAS-BLED score has been validated for VKA and non-VKA anticoagulants (the early-generation oral direct thrombin inhibitor ximelagatran) [21] and is the only bleeding risk score predictive of intracranial hemorrhage [19]. In a 2013 “real-world” comparison, HAS-BLED was easier to use and had better predictive accuracy than ATRIA [22].
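To make the scoring concrete, the sketch below shows how a HAS-BLED total can be tallied from its component risk factors (hypertension, abnormal renal or liver function, stroke, bleeding history or predisposition, labile INR, age over 65, and concomitant drugs or alcohol). It assumes the conventional 1 point per criterion with a maximum of 9; the function name and example patient are illustrative only and are not drawn from the cited validation studies.

```python
# Minimal illustrative sketch of HAS-BLED tallying; assumes 1 point per criterion
# (abnormal renal and liver function, and drug vs alcohol use, scored separately),
# for a maximum of 9 points. Not a substitute for the validated instrument.

def has_bled_score(hypertension, abnormal_renal, abnormal_liver, prior_stroke,
                   bleeding_history, labile_inr, age_over_65,
                   antiplatelet_or_nsaid_use, alcohol_excess):
    """Return a HAS-BLED bleeding-risk total (0-9) from boolean risk factors."""
    criteria = [hypertension, abnormal_renal, abnormal_liver, prior_stroke,
                bleeding_history, labile_inr, age_over_65,
                antiplatelet_or_nsaid_use, alcohol_excess]
    return sum(bool(c) for c in criteria)

# Example: an 80-year-old with hypertension and a prior bleed scores 3, a level
# commonly treated as signaling elevated bleeding risk and closer follow-up.
print(has_bled_score(True, False, False, False, True, False, True, False, False))
```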
One of the major challenges in geriatric patients is that those at highest risk for bleeding are the same patients who would derive the greatest benefit from anticoagulation [23]. Prediction scores can help clinicians balance the risk-benefit ratio of anticoagulation on a case-by-case basis. Although the scoring systems consider several medical conditions shown to significantly increase bleeding risk, such as hypertension, cerebrovascular disease, ischemic stroke, serious heart disease, diabetes, renal insufficiency, alcoholism, and liver disease, not all of these factors are included in every scoring scheme [23]. These conditions are more common among elderly patients, and this should be taken into account when estimating the risk-benefit ratio of oral anticoagulation [15]. Patients’ preferences should also be considered. It is essential for clinicians to clearly discuss treatment options with patients, as data suggest that clinician and patient perceptions of anticoagulation are often mismatched [24–26].
Performance of DOACs in Landmark Studies
Some specific differences in outcomes seen in landmark studies that may facilitate selection among the DOACs include the risk of major bleeding, risk of gastrointestinal bleeding, risk of acute coronary syndrome, exclusion of valvular heart disease, and noninferiority versus superiority as the primary endpoint when compared to warfarin.
Major Bleeding
Gastrointestinal Bleeding
Among the DOACs, gastrointestinal (GI) bleeding in the landmark studies was significantly greater with dabigatran, edoxaban, and rivaroxaban than with warfarin (HR, 1.49; 95% CI, 1.21–1.84; HR, 1.23; 95% CI, 1.02–1.50; and HR, 1.61; 95% CI, 1.30–1.99, respectively; P < 0.05 for all) [14–16]. Based on these data, clinicians may consider selecting apixaban in patients with a history of GI pathology. GI bleeding may be more common in elderly patients because of preexisting GI pathology and high local drug concentrations [29]. Clemens and colleagues suggested that an “anticoagulation GI stress test” may help detect GI malignancy [33]; they found that patients on DOACs who presented with a GI bleed were more likely to be diagnosed with a GI malignancy. As such, it is reasonable to screen patients with a fecal occult blood test within the first month after initiating DOAC treatment and then annually.
Acute Coronary Syndrome
A higher rate of myocardial infarction was observed with dabigatran 150 mg than with warfarin (0.74% vs 0.53% per year; P = 0.048) in the RE-LY study [16]. It is unknown whether this reflects a causative effect of dabigatran or a greater protective effect of warfarin against myocardial infarction. Nonetheless, it may be prudent to use an alternative agent in patients with a history of acute coronary syndrome.
Valvular Heart Disease
The risk of stroke and systemic embolism is higher in patients with valvular heart disease [34]. Patients with moderate to severe mitral stenosis or mechanical prosthetic heart valves were excluded from the DOAC landmark studies. Dabigatran was evaluated for prevention of stroke and systemic embolism in patients with mechanical heart valves in the RE-ALIGN study [35,36]. Patients were randomized to warfarin titrated to a target INR of 2 to 3 or 2.5 to 3.5 on the basis of thromboembolic risk, or to dabigatran 150 mg, 220 mg, or 300 mg twice daily adjusted to a target trough concentration of ≥ 50 ng/mL. The trial was terminated early because of a worse primary outcome (composite of stroke, systemic embolism, myocardial infarction, and death) with dabigatran versus warfarin (HR, 3.37; 95% CI, 0.76–14.95; P = 0.11). In addition, the rate of any bleeding was significantly greater with dabigatran (27%) than with warfarin (12%) (P = 0.01). Based on these data and the lack of data with the other DOACs, warfarin remains the standard of care for valvular heart disease [37]. DOACs may be considered in patients with AF and a bioprosthetic valve, mitral insufficiency, or aortic stenosis [37].
Landmark Study Efficacy Endpoints
The primary endpoint in each of the landmark studies was a composite of stroke (ischemic or hemorrhagic) and systemic embolism. For the primary endpoint, only dabigatran 150 mg twice daily and apixaban 5 mg twice daily were superior to warfarin for the prevention of stroke or systemic embolism in nonvalvular AF (HR, 0.66; 95% CI, 0.53–0.82; P < 0.001 and HR, 0.79; 95% CI, 0.66–0.95; P = 0.01, respectively). Both edoxaban (60 mg and 30 mg daily) and rivaroxaban were noninferior to warfarin for the primary endpoint. In terms of ischemic stroke, only dabigatran 150 mg twice daily was superior to warfarin in patients with nonvalvular AF (HR, 0.76; 95% CI, 0.60–0.98; P = 0.03) [19]. All of the DOACs demonstrated a reduction in hemorrhagic stroke.
DOAC Use in Elderly Patients
Pharmacokinetic Evaluations
Several pharmacokinetic studies have evaluated the influence of age on DOAC disposition. In a study evaluating the influence of age on apixaban disposition, the area under the concentration-time curve extrapolated to infinity was 32% higher in elderly subjects (aged 65 years or older) than in younger subjects (aged < 40 years) [38]. These data provide the rationale for dosage adjustment in individuals aged 80 years or older with either low body mass (weight ≤ 60 kg) or renal impairment (serum creatinine ≥ 1.5 mg/dL). In a pharmacokinetic study of dabigatran in subjects > 65 years of age, the time to steady state ranged from 2 to 3 days, corresponding to a half-life of 12 to 14 hours, and peak concentrations (256 ng/mL in females, 255 ng/mL in males) were reached after a median of 3 hours (range, 2.0–4.0 hours) [39]. These data suggest a 1.7- to 2-fold increase in dabigatran exposure in the elderly. The area under the curve of rivaroxaban was significantly higher in subjects > 75 years of age than in subjects 18 to 45 years of age, while total and renal clearance were decreased [40]. However, the time to maximum factor Xa inhibition and Cmax were not influenced by age.
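As a quick consistency check (not drawn from the cited study itself), the reported time to steady state follows from the standard pharmacokinetic rule of thumb that steady state is reached after roughly 4 to 5 half-lives:

```latex
t_{\mathrm{ss}} \approx (4\ \mathrm{to}\ 5)\, t_{1/2}
               \approx (4\ \mathrm{to}\ 5) \times (12\ \mathrm{to}\ 14\ \mathrm{h})
               \approx 48\ \mathrm{to}\ 70\ \mathrm{h}
               \approx 2\ \mathrm{to}\ 3\ \mathrm{days}
```

This agrees with the 2- to 3-day time to steady state reported for dabigatran in elderly subjects [39].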
Clinical Evaluations
Dabigatran
In a post-hoc analysis of the RE-LY trial, Eikelboom and colleagues found that patients 75 years of age and older treated with dabigatran 150 mg twice daily had a greater incidence of GI bleeding than those on warfarin, irrespective of renal function (1.85%/year vs 1.25%/year; P < 0.001) [29]. A numerically higher rate of major bleeding was also seen with dabigatran, although the difference did not reach statistical significance (5.10% vs 4.37%; P = 0.07). As a result, the 2012 Beers Criteria list dabigatran as a potentially inappropriate medication. An analysis was conducted of 134,414 elderly Medicare patients (defined as age > 65 years), with 37,587 person-years of follow-up, who were treated with dabigatran or warfarin [44]. Approximately 60% of patients included in the analysis were over 75 years of age. Compared with warfarin, dabigatran was associated with significant reductions in ischemic stroke (HR, 0.80; 95% CI, 0.67–0.96), intracranial hemorrhage (HR, 0.34; 95% CI, 0.26–0.46), and death (HR, 0.86; 95% CI, 0.77–0.96). As in the Eikelboom analysis, major gastrointestinal bleeding was significantly increased with dabigatran (HR, 1.28; 95% CI, 1.14–1.44).
Rivaroxaban
For rivaroxaban, a subgroup analysis of patients ≥ 75 years in the ROCKET-AF trial reported similar rates of major bleeding (HR, 1.11; 95% CI, 0.92–1.34) with rivaroxaban compared with warfarin [31]. Clinically relevant non-major bleeding was significantly higher for patients aged ≥ 75 years compared with patients aged < 75 years (P = 0.01).
Apixaban
Halvorsen and colleagues found that age did not influence the benefits of apixaban in terms of efficacy and safety [47]. In the cohort of patients aged 75 years or older, major bleeding was significantly reduced compared with warfarin (HR, 0.64; 95% CI, 0.52–0.79). The safety benefits persisted even in the setting of age greater than 75 years and renal impairment. A significant reduction in major bleeding (HR, 0.35; 95% CI, 0.14–0.86) was seen in elderly patients with a CrCl ≤ 30 mL/min (n = 221) treated with apixaban versus warfarin. Similarly, in elderly patients with a CrCl of 30 to 50 mL/min (n = 1898), a significant reduction in major bleeding was reported (HR, 0.53; 95% CI, 0.37–0.76). These data are consistent with a meta-regression analysis that found a linear relationship between the relative risk of major bleeding and the magnitude of renal excretion for the DOACs (r² = 0.66; P = 0.03) [48]. In this analysis, apixaban had the most favorable outcomes in terms of major bleeding compared with the other DOACs and also had the least dependence on renal function for clearance. In a pooled analysis of data from landmark trials, Ng and colleagues found that in elderly patients (defined as age > 75 years) with nonvalvular AF, only apixaban was associated with a significant reduction in both stroke and major hemorrhage (Figure 1) [49,50].
Edoxaban
Kato and colleagues performed a subgroup analysis of patients aged 75 years or older enrolled in the ENGAGE AF-TIMI 48 study [50]. Currently the results are published only in abstract form. Regardless of treatment, the risk of major bleeding and stroke increased significantly with age (P < 0.001). Lower absolute rates of major bleeding were reported with both the 60-mg and 30-mg edoxaban regimens than with warfarin (4.0%/year and 2.2%/year versus 4.8%/year, respectively; no P value provided).
Therapeutic Drug Monitoring
Collectively, the data on assessment of the anticoagulant activity of DOACs using coagulation assays are evolving. These tests include, but are not limited to, prothrombin time (PT), activated partial thromboplastin time (aPTT), thrombin clotting time (TT), dilute TT, activated clotting time (ACT), anti-factor Xa, and ecarin clotting time (ECT) assays. Although routine monitoring is not required, the ability to assess the degree of anticoagulation in select patient populations may prove beneficial. Future studies are essential to confirm whether assessing DOAC activity with coagulation assays in vulnerable populations such as the elderly improves clinical outcomes. Several reviews on this subject have been published [51–55]. The reader is encouraged to review these data, as there are significant limitations to currently available assays and incorrect interpretation may lead to suboptimal treatment decisions.
Renal and Hepatic Dysfunction
Depending on the specific agent, renal clearance of the DOACs varies from 27% to 80% [56–59]. Clinical trials often use the Cockcroft-Gault (CG) formula based on actual body weight to estimate renal function. The landmark DOAC trials differed in their strategy for estimating renal function with CG; for example, RE-LY and ROCKET-AF used actual body weight, while ARISTOTLE did not specify which body weight to use. Estimation of renal function or glomerular filtration rate (GFR) by CG is frequently discordant with actual renal function in the elderly [60]. The MDRD (Modification of Diet in Renal Disease) and Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equations are also commonly used estimates of GFR. In a cross-sectional study comparing the CG, MDRD, and CKD-EPI formulas in a clinical setting, data from potential kidney donors and adult patients who underwent GFR measurement showed that MDRD had the smallest mean bias [60]. Age influenced the absolute bias of renal function estimates for all formulas, and CG was additionally influenced by body weight and body mass index. MDRD was also reported to be a more accurate predictor of GFR than CG in adults < 70 years of age [61]. However, package inserts recommend dose adjustments based on CrCl estimated with the CG formula. This poses a problem when adjusting DOAC doses in elderly patients, in whom this antiquated equation is subject to overestimation of renal function. Among elderly patients with renal impairment, discordance between estimated and actual renal function was greater for dabigatran and rivaroxaban dosing than for apixaban dosing [61].
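For reference, the CG estimate itself is a simple calculation. The sketch below applies the standard formula, CrCl = (140 − age) × weight / (72 × serum creatinine), multiplied by 0.85 for women, with weight in kilograms and creatinine in mg/dL; the function name and example patient are hypothetical and this is not clinical guidance.

```python
# Illustrative sketch of the standard Cockcroft-Gault (CG) creatinine clearance
# estimate referenced in DOAC labeling; example values are hypothetical.

def cockcroft_gault_crcl(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Estimate creatinine clearance (mL/min) using the Cockcroft-Gault formula."""
    crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

# Example: an 82-year-old woman weighing 58 kg with a serum creatinine of
# 1.1 mg/dL has an estimated CrCl of about 36 mL/min, illustrating how a
# normal-looking creatinine can mask moderate renal impairment in the elderly.
print(round(cockcroft_gault_crcl(82, 58, 1.1, female=True)))  # ~36
```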
Renal excretion of unchanged dabigatran is the predominant elimination pathway (~80%) [58]. The FDA-approved dosing strategy for dabigatran in the US is 150 mg twice daily in patients with a CrCl ≥ 30 mL/min and 75 mg twice daily in patients with severe renal impairment (CrCl 15–30 mL/min); dabigatran is contraindicated in patients with a CrCl < 15 mL/min [58]. By comparison, Canadian and European Medicines Agency labeling lists severe renal impairment (CrCl < 30 mL/min) as a contraindication to use. The US-approved dosage for severe renal impairment was derived during the approval process using a pharmacokinetic simulation model [62,63]; the 75-mg twice-daily dosage was estimated to provide Cmax and Cmin concentrations similar to those of the 150-mg twice-daily dosage in moderate renal impairment. Compared with patients with a CrCl ≥ 80 mL/min, there was a 1.29- and a 1.47-fold increase in dabigatran trough plasma concentration in patients with a CrCl of 50–80 mL/min and 30–50 mL/min, respectively. There have been many postmarketing reports of hemorrhage with dabigatran [36,84,85]. Although reporting bias is likely given the novelty of the agent, clinicians may take key clinical pearls away from these reports: patients often had risk factors including low body weight, renal impairment, and polypharmacy with interacting drugs (eg, amiodarone). These risk factors are also important with the other DOACs.
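A minimal sketch of the US CrCl thresholds described above follows; the cut-offs are those quoted in the text, the function and return strings are illustrative only, and this is not intended as clinical guidance.

```python
# Illustrative mapping of estimated CrCl to the US-labeled dabigatran dose for
# nonvalvular AF, using the thresholds quoted in the text; not clinical guidance.

def dabigatran_us_dose(crcl_ml_min: float) -> str:
    """Return the US-labeled dabigatran dose for a given estimated CrCl (mL/min)."""
    if crcl_ml_min >= 30:
        return "150 mg twice daily"
    if crcl_ml_min >= 15:
        return "75 mg twice daily (severe renal impairment, CrCl 15-30 mL/min)"
    return "contraindicated (CrCl < 15 mL/min)"

# Example: the 82-year-old woman from the CG sketch (estimated CrCl ~36 mL/min)
# maps to 150 mg twice daily; a CrCl of 25 mL/min would map to 75 mg twice daily
# under US labeling but is a contraindication under the Canadian and European
# labeling described in the text.
print(dabigatran_us_dose(36))
```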
A subgroup analysis of ROCKET-AF evaluating rivaroxaban 15 mg daily in patients with a CrCl of 30–49 mL/min did not identify any differences in endpoints with the exception of fatal bleeding, which occurred less often with rivaroxaban (0.28%/yr vs. 0.74%/yr; P = 0.047) [64].
Monitoring of renal function is essential to mitigate the risk of drug accumulation. Clinicians should consider obtaining a baseline renal assessment, with annual reassessment in patients with normal renal function (CrCl ≥ 80 mL/min) or mild renal impairment (CrCl 50–79 mL/min) and reassessment 2 to 3 times per year in patients with moderate renal impairment (CrCl 30–49 mL/min) [65]. A summary of renal dose adjustments for DOAC therapy may be found in Table 5 [56–59].
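The same threshold logic can be expressed as a small scheduling helper; the bands and frequencies below are those quoted in the preceding paragraph, and the function is a hypothetical illustration rather than a monitoring protocol.

```python
# Illustrative helper translating the renal-monitoring frequencies quoted above
# into reassessments per year; CrCl bands below 30 mL/min are not covered by the text.

def renal_reassessments_per_year(crcl_ml_min: float) -> int:
    """Suggested renal-function reassessments per year for a given CrCl (mL/min)."""
    if crcl_ml_min >= 50:   # normal renal function or mild impairment
        return 1
    if crcl_ml_min >= 30:   # moderate impairment; text suggests 2 to 3 times per year
        return 3
    raise ValueError("CrCl < 30 mL/min: frequency not specified in the text")

print(renal_reassessments_per_year(45))  # -> 3
```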
In addition to renal function, hepatic impairment can also affect the metabolism of anticoagulants. Severe hepatic impairment can lead to a prolonged PT; therefore, patients with liver dysfunction who are treated with anticoagulation have an increased risk of hemorrhagic events. The large pivotal trials for the key indications of dabigatran, apixaban, and rivaroxaban excluded patients with significant signs of hepatic impairment. Table 5 provides dosing recommendations for the different DOACs in the setting of hepatic impairment [56–59].
Polypharmacy and the Potential for Adverse Consequences
Costs and Cost-Effectiveness of DOACs
With the high burden of AF and the aging population, analysis of cost and value is an important consideration [76]. There are limited publications comparing the cost-effectiveness between the anticoagulation options. However, numerous cost-effectiveness studies have evaluated the individual DOACs [71–79]. Overall, the studies suggest that the DOACs are a cost-effective alternative to warfarin in the general and elderly populations. One analysis reported that dabigatran may not be cost-effective in patients with a low CHADS2 score (≤ 2) [71].
Harrington et al [80] compared the cost-effectiveness of dabigatran, rivaroxaban, and apixaban versus warfarin. This cost-effectiveness study used published clinical trial data to build a decision model; the results indicated that for patients ≥ 70 years of age with an increased risk for stroke, normal renal function, and no previous contraindications to anticoagulant therapy, apixaban 5 mg, dabigatran 150 mg, and rivaroxaban 20 mg were cost-effective substitutes for warfarin for the prevention of stroke in nonvalvular AF [80]. Apixaban was the preferred anticoagulant for their hypothetical cohort of 70-year-old patients with nonvalvular AF, as it was most likely to be the cost-effective treatment option at all willingness-to-pay thresholds > $40,000 per quality-adjusted life-year gained [76,81].
Prescription costs may vary depending on payor and level of insurance. For a patient without prescription insurance, the annual price of generic warfarin is roughly $200 to $360, depending on dosage. Approximate annual costs for the DOACs (apixaban $4500, dabigatran $4500, and rivaroxaban $4800) are roughly 12 to 24 times the cost of warfarin [82]. However, most patients on these medications are over 65 years of age and have prescription coverage through Medicare Part D. Of note, patients may face a greater burden if or when they reach the “donut hole” coverage gap. Currently, once patients spend $2960 (for 2015) or $3310 (for 2016) on covered drugs, they fall into the donut hole unless they qualify for additional assistance. At this point Medicare Part D will reimburse 45% of the cost of the newer anticoagulants, since generics are currently unavailable. As a result, individual affordability may become an issue. Further complicating the scenario is the inability to apply coupon and rebate cards in the setting of government-funded prescription coverage. Clinicians should discuss these issues with their patients to help select the most valuable therapy.
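As a deliberately simplified illustration using only the figures quoted above (the 2016 threshold, a $4500-per-year DOAC counted entirely toward that threshold, 45% reimbursement in the gap, and ignoring deductibles and initial-phase cost sharing), the additional out-of-pocket exposure in the coverage gap would be on the order of:

```latex
0.55 \times (\$4500 - \$3310) \approx \$655\ \text{per year}
```

Actual out-of-pocket costs will differ with plan design, but the calculation illustrates why affordability can become an issue once the coverage gap is reached.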
Conclusions and Recommendations
Corresponding author: Luigi Brunetti, PharmD, MPH, Rutgers University, 160 Frelinghuysen Rd, Piscataway, NJ 08854, [email protected].
1. Fanikos J, Stapinski C, Koo S, et al. Medication errors associated with anticoagulant therapy in the hospital. Am J Cardiol 2004;94:532–5.
2. Budnitz DS, Lovegrove MC, Shehab N, Richards CL. Emergency hospitalizations for adverse drug events in older Americans. N Engl J Med 2011;365:2002–12.
3. Institute for Safe Medication Practices. QuarterWatch. 9 January 2013. Available at http://www.ismp.org/quarterwatch/pdfs/2012Q2.pdf.
4. Hajjar ER, Hanlon JT, Artz MB, et al. Adverse drug reaction risk factors in older outpatients. Am J Geriatr Pharmacother 2003;1:82–9.
5. Gurwitz JH, Field TS, Harrold LR, et al. Incidence and preventability of adverse drug events among older persons in the ambulatory setting. JAMA 2003;289:1107–16.
6. Singh S. Defining ‘elderly’ in clinical practice guidelines for pharmacotherapy. Pharm Pract 2014;12:489.
7. Singh S, Bajorek B. Pharmacotherapy in the aging patient: The impact of age per se (a review). Ageing Res Rev 2015 Jul 28. pii: S1568-1637(15)30008-8.
8. Go AS, Hylek EM, Phillips KA, et al. Prevalence of diagnosed atrial fibrillation in adults: national implications for rhythm management and stroke prevention: the anticoagulation and risk factors in atrial fibrillation (ATRIA) Study. JAMA 2001;285:2370–5.
9. Lip GY, Brechin CM, Lane DA. The global burden of atrial fibrillation and stroke: a systematic review of the epidemiology of atrial fibrillation in regions outside North America and Europe. Chest 2012;142:1489–98.
10. Camm AJ, Lip GY, De Caterina R, et al. 2012 Focused update of the ESC guidelines for the management of atrial fibrillation: an update of the 2010 ESC guidelines for the management of atrial fibrillation-developed with the special contribution of the European Heart Rhythm Association. Europace 2012;14:1385–413.
11. Kannel WB, Benjamin EJ. Status of the epidemiology of atrial fibrillation. Med Clin North Am 2008;92:17–40.
12. Chugh SS, Havmoeller R, Narayanan K, et al. Worldwide epidemiology of atrial fibrillation: a global burden 2010 study. Circulation 2014;129:837-47.
13. Miyasaka Y, Barnes ME, Gersh BJ, et al. Secular trends in incidence of atrial fibrillation in Olmsted County, Minnesota, 1980 to 2000, and implications on the projections for future prevalence. Circulation 2006;114:119–25.
14. Connolly SJ, Ezekowitz MD, Yusuf S, et al. Dabigatran versus warfarin in patients with atrial fibrillation. N Engl J Med 2009;361:1139–51.
15. Patel MR, Mahaffey KW, Garg J, et al. Rivaroxaban versus warfarin in nonvalvular atrial fibrillation. N Engl J Med 2011;365:883–91.
16. Granger CB, Alexander JH, McMurray JJ, et al. Apixaban versus warfarin in patients with atrial fibrillation. N Engl J Med 2011;365:981–92.
17. Apostolakis S, Lane DA, Guo Y, et al. Performance of the HEMORR2HAGES, ATRIA, and HAS-BLED Bleeding Risk–Prediction Scores in Patients With Atrial Fibrillation Undergoing Anticoagulation: The AMADEUS (Evaluating the Use of SR34006 Compared to Warfarin or Acenocoumarol in Patients With Atrial Fibrillation) Study. J Am Coll Cardiol 2012;60:861–7.
18. Fang MC, Go AS, Chang Y, et al. A new risk scheme to predict warfarin-associated hemorrhage: the ATRIA (Anticoagulation and Risk Factors in Atrial Fibrillation) study. J Am Coll Cardiol 2011;58:395–401.
19. Pisters R, Lane DA, Nieuwlaat R, et al. A novel user-friendly score (HAS-BLED) to assess 1-year risk of major bleeding in patients with atrial fibrillation: The Euro Heart Survey. Chest 2010;138:1093–100.
20. Heidbuchel H, Verhamme P, Alings M, et al. Updated European Heart Rhythm Association Practical Guide on the use of non-vitamin K antagonist anticoagulants in patients with non-valvular atrial fibrillation. Europace 2015;17:1467–507.
21. Lip GY, Frison L, Halperin JL, Lane DA. Comparative validation of a novel risk score for predicting bleeding risk in anticoagulated patients with atrial fibrillation: the HAS-BLED (Hypertension, Abnormal Renal/Liver Function, Stroke, Bleeding History or Predisposition, Labile INR, Elderly, Drugs/Alcohol Concomitantly) score. J Am Coll Cardiol 2011;57:173–80.
22. Roldán V, Marín F, Fernández H, et al. Predictive value of the HAS-BLED and ATRIA bleeding scores for the risk of serious bleeding in a "real-world" population with atrial fibrillation receiving anticoagulant therapy. Chest 2013;43:179–84.
23. Robert-Ebadi H, Le Gal G, Righini M. Use of anticoagulants in elderly patients: practical recommendations. Clin Interv Aging 2009;4:165–77.
24. Barcellona D, Contu P, Sorano GG, et al. The management of oral anticoagulant therapy: the patient's point of view. Thromb Haemost 2000;83:49–53.
25. Lancaster TR, Singer DE, Sheehan MA, et al. The impact of long-term warfarin therapy on quality of life. Evidence from a randomized trial. Boston Area Anticoagulation Trial for Atrial Fibrillation Investigators. Arch Intern Med 1991;151:1944–9.
26. Devereaux PJ, Anderson DR, Gardner MJ, et al. Differences between perspectives of physicians and patients on anticoagulation in patients with atrial fibrillation: observational study. BMJ 2001;323:1218–22.
27. Giugliano RP, Ruff CT, Braunwald E, Murphy SA. Edoxaban versus warfarin in patients with atrial fibrillation. N Engl J Med 2013;369:2093–104.
28. Barco S, Cheung YW, Eikelboom JW, Coppens M. New oral anticoagulants in elderly patients. Best Pract Res Clin Haematol 2013;26:215–24
29. Eikelboom JW, Wallentin L, Connolly SJ, et al. Risk of bleeding with 2 doses of dabigatran compared with warfarin in older and younger patients with atrial fibrillation: an analysis of the randomized evaluation of long-term anticoagulant therapy (RE-LY) trial. Circulation 2011;123:2363–72.
30. Coppens M, Eikelboom JW, Ezekowitz M, et al. Dabigatran versus warfarin in very elderly patients with atrial fibrillation: results from the RE-LY trial. Abstract. Circulation 2012;126:A15l537.
31. Halperin JL, Wojdyla D, Piccini JP, et al. Efficacy and safety of rivaroxaban compared with warfarin among elderly patients with nonvalvular atrial fibrillation in the ROCKET-AF trial. Abstract. Stroke 2012;43:A148.
32. Ruff CT, Giugliano RP, Braunwald E, et al. Comparison of the efficacy and safety of new oral anticoagulants with warfarin in patients with atrial fibrillation: a meta-analysis of randomised trials. Lancet 2014;383:955–62
33. Clemens A, Strack A, Noack H, et al. Anticoagulant-related gastrointestinal bleeding—could this facilitate early detection of benign or malignant gastrointestinal lesions? Ann Med 2014;46:672–8.
34. Petty GW, Khandheria BK, Whisnant JP, et al. Predictors of cerebrovascular events and death among patients with valvular heart disease: A population-based study. Stroke 2000;31:2628–35.
35. Eikelboom JW, Connolly SJ, Brueckmann M, et al. Dabigatran versus warfarin in patients with mechanical heart valves. N Engl J Med 2013;369:1206–14.
36. Schomburg JL, Medina EM, Lahti MT, Bianco RW. Dabigatran versus warfarin after mechanical mitral valve replacement in the swine model. J Invest Surg 2012;25:150–5.
37. Douketis J, Bell AD, Eikelboom J, Liew A. Approach to the new oral anticoagulants in family practice: part 2: addressing frequently asked questions. Can Fam Physician 2014;60:997–1001.
38. Frost CE, Nepal S, Barrett YC, LaCreta F. Effects of age and gender on the single-dose pharmacokinetics (PK) and pharmacodynamics (PD) of apixaban. Abstract. J Thromb Haemost 2009;7(Suppl 2):PP-MO-407.
39. Stangier J, Stahle H, Rathgen K et al. Pharmacokinetics and pharmacodynamics of the direct oral thrombin inhibitor dabigatran in healthy elderly subjects. Clin Pharmacokinet 2008;47:47–59.
40. Kubitza D, Becka M, Mueck W. The effect of extreme age and gender on the pharmacology and tolerability of rivaroxaban, an oral direct factor Xa inhibitor. Blood 2006;108: Abstract 905.
41. Siegal DM, Crowther MA. Acute management of bleeding in patients on novel oral anticoagulants. Eur Heart J 2013;34:489–98.
42. Evans A, Kalra L. Are the results of randomized controlled trials on anticoagulation in patients with atrial fibrillation generalizable to clinical practice? Arch Intern Med 2001;161:1443–7.
43. Harper P, Young L, Merriman E. Bleeding risk with dabigatran in the frail elderly. N Engl J Med 2012;366:864–6.
44. Graham DJ, Reichman ME, Wernecke M, et al. Cardiovascular, bleeding, and mortality risks in elderly Medicare patients treated with dabigatran or warfarin for nonvalvular atrial fibrillation. Circulation 2015;131:157–64.
45. Avgil-Tsadok M, Jackevicius CA, Essebag V, et al. Dabigatran use in elderly patients with atrial fibrillation. Thromb Haemost 2015;115(1).
46. Uchino K, Hernandez AV. Dabigatran association with higher risk of acute coronary events: meta-analysis of noninferiority randomized controlled trials. Arch Intern Med 2012;172:397–402.
47. Halvorsen S, Atar D, Yang H, et al. Efficacy and safety of apixaban compared with warfarin according to age for stroke prevention in atrial fibrillation: observations from the ARISTOTLE trial. Eur Heart J 2014;35:1864–72.
48. Lega JC, Bertoletti L, Gremillet C, et al. Consistency of safety profile of new oral anticoagulants in patients with renal failure. J Thromb Haemost 2014;12:337–43.
49. Ng KH, Hart RG, Eikelboom JW. Anticoagulation in patients aged ≥ 75 years with atrial fibrillation: role of novel oral anticoagulants. Cardiol Ther 2013;2:135–49.
50. Kato ET, Guigliano RP, Ruff CT, et al. Efficacy and safety of edoxaban for the management of elderly patients with atrial fibrillation: Engage-AF TIMI 48. Circulation 2014;130:A16612.
51. Tripodi A. The laboratory and the new oral anticoagulants. Clin Chem 2013;59:353–62.
52. Tripodi A, Di Iorio G, Lippi G, et al. Position paper on laboratory testing for patients taking new oral anticoagulants. Consensus document of FCSA, SIMeL, SIBioC and CISMEL. Clin Chem Lab Med 2012;50:2137-40.
53. Heidbuchel H, Verhamme P, Alings M, et al. European Heart Rhythm Association Practical Guide on the use of new oral anticoagulants in patients with non-valvular atrial fibrillation. Europace 2013;15:625–51.
54. Chin PK, Wright DF, Patterson DM, et al. A proposal for dose-adjustment of dabigatran etexilate in atrial fibrillation guided by thrombin time. Br J Clin Pharmacol 2014;78:599–609.
55. Miyares MA, Davis K. Newer oral anticoagulants: a review of laboratory monitoring options and reversal agents in the hemorrhagic patient. Am J Health Syst Pharm 2012;69:1473–84.
56. Xarelto [package insert]. Titusville, NJ. Janssen Pharmaceuticals. September 2014.
57. Eliquis [package insert]. Princeton, NJ: Bristol-Meyers Squibb. June 2015.
58. Pradaxa [package insert]. Ridgefield, CT: Boehringer Ingelheim Pharmaceuticals. October 2010.
59. Savaysa [package insert]. Parsippany, NJ: Daiichi Sankyo. September 2015.
60. Michels WM, Grootendorst DC, Verduijn M, et al. Performance of the Cockcroft-Gault, MDRD, and new CKD-EPI formulas in relation to GFR, age, and body size. Clin J Am Soc Nephrol 2010;5:1003–9.
61. Poulsen BK, Grove EL, Husted SE. New oral anticoagulants: a review of the literature with particular emphasis on patients with impaired renal function. Drugs 2012;72:1739–53.
62. Hariharan S, Madabushi R. Clinical pharmacology basis of deriving dosing recommendations for dabigatran in patients with severe renal impairment. J Clin Pharmacol 2012;52:119S–25S.
63. Lehr T, Haertter S, Liesenfeld KH, et al. Dabigatran etexilate in atrial fibrillation patients with severe renal impairment: dose identification using pharmacokinetic modeling and simulation. J Clin Pharmacol 2012;52:1373–8.
64. Fox KAA, Piccini JP, Wojdyla D, et al. Prevention of stroke and systemic embolism with rivaroxaban compared with warfarin in patients with non-valvular atrial fibrillation and moderate renal impairment. Eur Heart J 2011;32:2387–94.
65. Pengo V, Crippa L, Falanga A, et al. Questions and answers on the use of dabigatran and perspectives on the use of other new oral anticoagulants in patients with atrial fibrillation. A consensus document of the Italian Federation of Thrombosis Centers (FCSA). Thromb Haemost 2011;106:868–76.
66. Atkin PA, Veitch PC, Veitch EM, Ogle SJ. The epidemiology of serious adverse drug reactions among the elderly. Drugs Aging 1999;14:141–52.
67. Qato DM, Alexander GC, Conti RM, et al. Use of prescription and over-the-counter medications and dietary supplements among older adults in the United States. JAMA 2008;300:2867–78.
68. Skov J, Bladbjerg EM, Sidelmann J, et al. Plenty of pills: polypharmacy prevails in patients of a Danish anticoagulant clinic. Eur J Clin Pharmacol 2011;67:1169–74.
69. Ukena C, Bohm M, Schirmer SH. Hot topics in cardiology: data from IABP-SHOCK II, TRILOGY-ACS, WOEST, ALTITUDE, FAME II and more. Clin Res Cardiol 2012;101:861–74.
70. Dewilde WJ, Oirbans T, et al. Use of clopidogrel with or without aspirin in patients taking oral anticoagulant therapy and undergoing percutaneous coronary intervention: an open-label, randomised, controlled trial. Lancet 2013;381:1107–15.
71. Shah SV, Gage BF. Cost-effectiveness of dabigatran for stroke prophylaxis in atrial fibrillation. Circulation 2011;123:2562–70.
72. Sorensen SV, Kansal AR, Connolly S, et al. Cost-effectiveness of dabigatran etexilate for the prevention of stroke and systemic embolism in atrial fibrillation: a Canadian payer perspective. Thromb Haemost 2011;105:908–19.
73. Adcock AK, Lee-Iannotti JK, Aguilar MI, et al. Is dabigatran cost effective compared with warfarin for stroke prevention in atrial fibrillation?: a critically appraised topic. Neurologist 2012;18:102–7.
74. Kamel H, Johnston SC, Easton JD, Kim AS. Cost-effectiveness of dabigatran compared with warfarin for stroke prevention in patients with atrial fibrillation and prior stroke or transient ischemic attack. Stroke 2012;43:881–3.
75. Langkilde LK, Bergholdt AM, Overgaard M. Cost-effectiveness of dabigatran etexilate for stroke prevention in non-valvular atrial fibrillation. J Med Econ 2012;15:695–703.
76. Kansal AR, Sorensen SV, Gani R, et al. Cost-effectiveness of dabigatran etexilate for the prevention of stroke and systemic embolism in UK patients with atrial fibrillation. Heart 2012; 98:573–8.
77. Freeman JV, Zhu RP, Owens DK, et al. Cost-effectiveness of dabigatran compared with warfarin for stroke prevention in atrial fibrillation. Ann Intern Med 2011;154:1–11.
78. Pink J, Lane S, Pirmohamed M, Hughes DA. Dabigatran etexilate versus warfarin in management of non-valvular atrial fibrillation in UK context: quantitative benefit-harm and economic analyses. BMJ 2011;343:d6333.
79. Ali A, Bailey C, Abdelhafiz AH. Stroke prophylaxis with warfarin or dabigatran for patients with non-valvular atrial fibrillation-cost analysis. Age Ageing 2012;41:681–4.
80. Harrington AR, Armstrong EP, Nolan PE Jr, Malone DC. Cost effectiveness of apixaban, dabigatran, rivaroxaban, and warfarin for stroke prevention in atrial fibrillation. Stroke 2013;44:1676–81.
81. Amin A, Lingohr-Smith M, Bruno A, et al. Economic evaluations of medical cost differences: use of targeted-specific oral anticoagulants vs. warfarin among patients with nonvalvular atrial fibrillation and venous thromboembolism in the US. J Hematol Thrombo Dis 2015;3:209.
82. Lexicomp, Lexi-Drugs. Hudson, OH: Lexi-Comp.
83. U.S. Food and Drug Administration, Center for Drug Evaluation and Research. Apixaban NDA 202155/S-002 approval letter. Jan 30 2014. Available at http://www.accessdata.fda.gov/drugsatfda_docs/appletter/2014/202155Orig1s002ltr.pdf
84. Hinojar R, Jimenez-Natcher JJ, Fernandez-Golfin C, Zamorano JL. New oral anticoagulants: a practical guide for physicians. Eur Heart J Cardiovasc Pharmacother 2015;1:134–45.
85. Connolly SJ, Eikelboom J, Joyner C, et al. Apixaban in patients with atrial fibrillation. N Engl J Med 2011;364:806–17.
From the Ernest Mario School of Pharmacy, Rutgers University, Piscataway, NJ.
Abstract
- Objective: To provide a clinical summary of the available data evaluating the use of direct oral anticoagulants (DOACs) in geriatric patients with nonvalvular atrial fibrillation.
- Methods: MEDLINE, Web of Science, and Google Scholar were used to identify pertinent systematic reviews, randomized controlled trials, observational studies, and pharmacokinetic studies evaluating use of DOACs in the geriatric population.
- Results: A total of 8 systematic reviews, 5 randomized controlled trials, 2 observational studies, and 5 pharmacokinetic studies of relevance were identified for inclusion in this review. The landscape of anticoagulation has changed dramatically over the past 5 years, beginning with the development and marketing of an oral direct thrombin inhibitor and followed by 3 oral direct factor Xa inhibitors. Despite significant advances in oral anticoagulation, many questions remain as to the best therapeutic approach in the geriatric population because the literature is lacking. This population is at higher risk of stroke; however, clinicians may defer anticoagulant therapy because of the fear of hemorrhagic complications. Clinicians must consider the risk-benefit ratio and the associated outcomes in geriatric patients compared to other patient populations.
- Conclusions: Interpreting the available literature and understanding the benefits and limitations of the DOACs is critical when selecting the most appropriate pharmacologic strategy in geriatric patients.
Anticoagulants are among the top 5 drug classes associated with patient harm in the US [1] and are commonly reported as contributing to hospitalizations [2]. In just one quarter in 2012 alone, warfarin, dabigatran, and rivaroxaban accounted for 1734 of 50,289 adverse events reported to the Food and Drug Administration (FDA), including 233 deaths [3]. Appropriate use of anticoagulant agents and consideration of individual patient risk factors are essential to mitigate the occurrence of adverse consequences, especially in the geriatric population. This population is more likely to have risk factors for adverse drug events, for example, polypharmacy, age-related changes in pharmacokinetics and pharmacodynamics, and diminished organ function (ie, renal and hepatic) [4,5]. Another important consideration is the lack of consensus on the definition of a “geriatric” or “elderly” patient. Although many consider a chronological age of > 65 years as the defining variable for a geriatric individual, this definition does not account for overall health status [6,7]. Clinicians should consider this shortcoming when evaluating the quality of geriatric studies. For example, a study claiming to evaluate the pharmacokinetics of a drug in a geriatric population enrolling healthy subjects aged > 65 years may result in data that do not translate to clinical practice.
Compounding the concern for iatrogenic events is the frequency of anticoagulant use in the geriatric population, as several indications are found more commonly in this age-group. Stroke prevention in nonvalvular atrial fibrillation (AF), the most common arrhythmia in the elderly, is a common indication for long-term anticoagulation [8]. The prevalence of AF increases with age and is usually higher in men than in women [9,10]. AF is generally uncommon before 60 years of age, but the prevalence increases noticeably thereafter, affecting approximately 10% of the overall population by 80 years of age [11]. The median age of patients who have AF is 75 years, with approximately 70% of patients between 65 and 85 years of age [8,12]. Currently in the United States, an estimated 2.3 million people are diagnosed with AF [8]. By 2020, the AF population is projected to increase to 7.5 million individuals, with an expected prevalence of 13.5% among individuals ≥ 75 years of age and 18.2% among those ≥ 85 years of age [13]. These data underscore the importance of considering the influence of age on the balance between efficacy and safety of anticoagulant therapy.
Direct oral anticoagulants (DOACs) represent the first alternatives to warfarin in over 6 decades. Currently available products in the US include apixaban, dabigatran, edoxaban, and rivaroxaban. DOACs possess many of the characteristics of an ideal anticoagulant, including predictable pharmacokinetics, a wider therapeutic window compared to warfarin, minimal drug interactions, fixed dosing, and no need for routine evaluation of coagulation parameters. The safety and efficacy of the DOACs for stroke prevention in nonvalvular AF have been substantiated in several landmark clinical trials [14–16]. Yet several important questions remain to be addressed, such as management of excessive anticoagulation, clinical outcome data with renally adjusted doses (an exclusion criterion in many landmark studies was a creatinine clearance of < 25–30 mL/min), whether monitoring of coagulation parameters could enhance efficacy and safety, and optimal dosing strategies in geriatric patients. This review provides clinicians with a summary of data from landmark studies, post-marketing surveillance, and pharmacokinetic evaluations to support DOAC selection in the geriatric population.
Evaluating Bleeding Risk
Bleeding risk prediction scores such as HEMORR2HAGES, ATRIA, and HAS-BLED have been extensively evaluated with warfarin therapy, but their performance in predicting DOAC-related bleeding has not been definitively established [17–19]. Nonetheless, until tools evaluated specifically for DOACs are developed, it is reasonable to use these scores for risk prediction in combination with clinical judgment. As an example, the European Heart Rhythm Association practical guide on the use of non–vitamin K antagonist oral anticoagulants in patients with nonvalvular AF suggests that the HAS-BLED score may be used to identify risk factors for bleeding and correct those that are modifiable [20]. The HAS-BLED score is validated for vitamin K antagonist (VKA) and non-VKA anticoagulants (including the early-generation oral direct thrombin inhibitor ximelagatran) [21] and is the only bleeding risk score predictive of intracranial hemorrhage [19]. In a 2013 “real world” comparison, HAS-BLED was easier to use and had better predictive accuracy than ATRIA [22].
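For readers less familiar with the mechanics of these scores, the sketch below illustrates HAS-BLED-style scoring in Python, using the components spelled out in the acronym (hypertension; abnormal renal or liver function; stroke; bleeding history or predisposition; labile INR; elderly; drugs or alcohol concomitantly) [21]. It is an illustrative simplification, not a validated implementation, and the example patient is hypothetical; criteria definitions and cut-offs should be taken from the primary references.

```python
# A minimal sketch of HAS-BLED scoring (components per the acronym in reference 21).
# Illustrative only; the example patient and interpretation comments are simplified.

def has_bled_score(hypertension, abnormal_renal, abnormal_liver, stroke,
                   bleeding_history, labile_inr, elderly_over_65,
                   antiplatelet_or_nsaid, alcohol_excess):
    """Each risk factor present contributes 1 point; the maximum score is 9."""
    factors = [hypertension, abnormal_renal, abnormal_liver, stroke,
               bleeding_history, labile_inr, elderly_over_65,
               antiplatelet_or_nsaid, alcohol_excess]
    return sum(1 for present in factors if present)

# Hypothetical example: a 78-year-old with hypertension and concomitant NSAID use.
score = has_bled_score(hypertension=True, abnormal_renal=False, abnormal_liver=False,
                       stroke=False, bleeding_history=False, labile_inr=False,
                       elderly_over_65=True, antiplatelet_or_nsaid=True,
                       alcohol_excess=False)
print(score)  # 3; scores of 3 or more are conventionally treated as elevated bleeding risk
```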
One of the major challenges in geriatric patients is that those at highest risk for bleeding are often those who would derive the greatest benefit from anticoagulation [23]. The prediction scores can help clinicians balance the risk-benefit ratio for anticoagulation on a case-by-case basis. The scoring systems weigh several factors, including medical conditions shown to significantly increase bleeding risk (hypertension, cerebrovascular disease, ischemic stroke, serious heart disease, diabetes, renal insufficiency, alcoholism, and liver disease), but not all of these conditions are included in every scoring scheme [23]. These conditions are more common among elderly patients, and this should be taken into account when estimating the risk-benefit ratio of oral anticoagulation [15]. Patients’ preferences should also be taken into account; it is essential for clinicians to clearly discuss treatment options with patients, as data suggest that clinician and patient perceptions of anticoagulation are often mismatched [24–26].
Performance of DOACs in Landmark Studies
Some specific differences in outcomes seen in landmark studies that may facilitate selection among the DOACs include the risk of major bleeding, risk of gastrointestinal bleeding, risk of acute coronary syndrome, exclusion of valvular heart disease, and noninferiority versus superiority as the primary endpoint when compared to warfarin.
Major Bleeding
Gastrointestinal Bleeding
In the landmark studies, gastrointestinal (GI) bleeding was significantly greater with dabigatran, edoxaban, and rivaroxaban than with warfarin (HR, 1.49; 95% CI, 1.21–1.84; HR, 1.23; 95% CI, 1.02–1.50; and HR, 1.61; 95% CI, 1.30–1.99, respectively; P < 0.05 for all) [14–16]. Based on these data, clinicians may consider apixaban in patients with a history of GI pathology. GI bleeding may be more common in elderly patients due to the potential for preexisting GI pathology and high local concentrations of drug [29]. Clemens and colleagues suggested that an “anticoagulation GI stress test” may predict GI malignancy [33]; they found that patients on DOACs who presented with a GI bleed were more likely to be diagnosed with a GI malignancy. As such, it is reasonable to screen patients with a fecal occult blood test within the first month after initiating DOAC treatment and then annually.
Acute Coronary Syndrome
A higher rate of myocardial infarction was observed with dabigatran 150 mg versus warfarin (0.74% vs 0.53% per year; P = 0.048) in the RE-LY study [14]. Whether this increase reflects a causative effect of dabigatran or a greater protective effect of warfarin against myocardial infarction is unknown. Nonetheless, it may be prudent to use an alternative agent in patients with a history of acute coronary syndrome.
Valvular Heart Disease
The risk of stroke and systemic embolism is higher in patients with valvular heart disease [34]. Patients with moderate to severe mitral stenosis or mechanical prosthetic heart valves were excluded from the DOAC landmark studies. Dabigatran was evaluated for prevention of stroke and systemic embolism in patients with mechanical heart valves in the RE-ALIGN study [35,36]. Patients were randomized to warfarin (titrated to a target INR of 2 to 3 or 2.5 to 3.5 on the basis of thromboembolic risk) or dabigatran (150 mg, 220 mg, or 300 mg twice daily, adjusted to a target trough concentration of ≥ 50 ng/mL). The trial was terminated early due to a worse primary outcome (composite of stroke, systemic embolism, myocardial infarction, and death) with dabigatran versus warfarin (HR, 3.37; 95% CI, 0.76–14.95; P = 0.11). In addition, rates of any bleeding were significantly greater with dabigatran (27%) than with warfarin (12%) (P = 0.01). Based on these data and the lack of data with the other DOACs, warfarin remains the standard of care for valvular heart disease [37]. DOACs may be considered in patients with AF and a bioprosthetic valve, mitral insufficiency, or aortic stenosis [37].
Landmark Study Efficacy Endpoints
The primary endpoint in each of the landmark studies was a composite of stroke (ischemic or hemorrhagic) and systemic embolism. For the primary endpoint, only dabigatran 150 mg twice daily and apixaban 5 mg twice daily were superior to warfarin for the prevention of stroke or systemic embolism in nonvalvular AF (HR, 0.66; 95% CI, 0.53–0.82; P < 0.001 and HR, 0.79; 95% CI, 0.66–0.95; P = 0.01, respectively). Both edoxaban (60 mg and 30 mg daily) and rivaroxaban were noninferior to warfarin for the primary endpoint. For ischemic stroke, only dabigatran 150 mg twice daily was superior to warfarin in patients with nonvalvular AF (HR, 0.76; 95% CI, 0.60–0.98; P = 0.03) [19]. All of the DOACs demonstrated a reduction in hemorrhagic stroke.
DOAC Use in Elderly Patients
Pharmacokinetic Evaluations
Several pharmacokinetic studies have evaluated the influence of age on DOAC disposition. In a study evaluating the influence of age on apixaban disposition, the area under the concentration-time curve extrapolated to infinity was 32% higher in elderly subjects (aged 65 years or older) compared with younger subjects (aged < 40 years) [38]. These data provide the rationale for dosage adjustment in individuals aged 80 years or older with either low body mass (weight ≤ 60 kg) or renal impairment (serum creatinine ≥ 1.5 mg/dL). In a pharmacokinetic study evaluating dabigatran in subjects > 65 years of age, the time to steady state ranged from 2 to 3 days, corresponding to a half-life of 12 to 14 hours, and peak concentrations (256 ng/mL in females, 255 ng/mL in males) were reached after a median of 3 hours (range, 2.0–4.0 hours) [39]. These data suggest a 1.7- to 2-fold increase in exposure relative to younger subjects. The area under the curve of rivaroxaban was significantly higher in subjects > 75 years of age than in subjects 18–45 years of age, while total and renal clearance were decreased [40]. However, the time to maximum factor Xa inhibition and Cmax were not influenced by age.
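As a concrete illustration of how the age-based criteria above might be operationalized, the sketch below encodes the apixaban dosage-adjustment rule exactly as described in this paragraph (age ≥ 80 years together with either weight ≤ 60 kg or serum creatinine ≥ 1.5 mg/dL). It mirrors the text rather than reproducing full labeling, so the product package insert [57] remains the authoritative source; the example patient is hypothetical.

```python
# Illustrative sketch of the apixaban dose-reduction criteria as described in the text
# (age >= 80 years plus either weight <= 60 kg or serum creatinine >= 1.5 mg/dL).
# Simplified for illustration; consult the package insert for actual labeling.

def apixaban_reduced_dose_indicated(age_years: int, weight_kg: float,
                                    serum_creatinine_mg_dl: float) -> bool:
    elderly = age_years >= 80
    low_weight = weight_kg <= 60
    elevated_scr = serum_creatinine_mg_dl >= 1.5
    return elderly and (low_weight or elevated_scr)

# Hypothetical example: an 82-year-old weighing 58 kg with a serum creatinine of 1.1 mg/dL.
print(apixaban_reduced_dose_indicated(82, 58, 1.1))  # True -> dose reduction per the criteria above
```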
Clinical Evaluations
Dabigatran
In a post-hoc analysis of the RE-LY trial, Eikelboom and colleagues found that patients 75 years of age and older treated with dabigatran 150 mg twice daily had a greater incidence of GI bleeding, irrespective of renal function, compared with those on warfarin (1.85%/year vs 1.25%/year; P < 0.001) [29]. A numerically higher rate of major bleeding was also seen with dabigatran (5.10% vs 4.37%; P = 0.07). As a result, the 2012 Beers Criteria list dabigatran as a potentially inappropriate medication. An analysis was conducted of 134,414 elderly Medicare patients (defined as age > 65 years) with 37,587 person-years of follow-up who were treated with dabigatran or warfarin [44]. Approximately 60% of patients included in the analysis were over age 75 years. Dabigatran was associated with a significant reduction in ischemic stroke (HR, 0.80; 95% CI, 0.67–0.96), intracranial hemorrhage (HR, 0.34; 95% CI, 0.26–0.46), and death (HR, 0.86; 95% CI, 0.77–0.96) when compared with warfarin. As in the Eikelboom analysis, major gastrointestinal bleeding was significantly increased with dabigatran (HR, 1.28; 95% CI, 1.14–1.44).
Rivaroxaban
For rivaroxaban, a subgroup analysis of patients ≥ 75 years in the ROCKET-AF trial reported similar rates of major bleeding (HR, 1.11; 95% CI, 0.92–1.34) with rivaroxaban compared with warfarin [31]. Clinically relevant non-major bleeding was significantly higher for patients aged ≥ 75 years compared with patients aged < 75 years (P = 0.01).
Apixaban
Halvorsen and colleagues found that age did not influence the benefits of apixaban in terms of efficacy and safety [47]. In the cohort of patients aged 75 years or older, major bleeding was significantly reduced compared to warfarin (HR, 0.64; 95% CI, 0.52–0.79). The safety benefits persisted even in the setting of age greater than 75 years and renal impairment. A significant reduction in major bleeding (HR, 0.35; 95% CI, 0.14–0.86) was seen in elderly patients with a CrCl ≤ 30 mL/min (n = 221) treated with apixaban versus warfarin. Similarly, in elderly patients with a CrCl 30 to 50 mL/min (n = 1898) a significant reduction in major bleeding was reported (HR, 0.53; 95% CI, 0.37–0.76). These data are consistent with a meta-regression analysis that found a linear relationship between the relative risk of major bleeding and the magnitude of renal excretion for the DOACs (r2 = 0.66, P = 0.03) [48]. In this analysis, apixaban had the most favorable outcomes in terms of major bleeding compared to the other DOACs and also has the least dependence on renal function for clearance. In a pooled analysis of data from landmark trials, Ng and colleagues found that in elderly patients (defined as age > 75 years) with nonvalvular AF, only apixaban was associated with a significant reduction in both stroke and major hemorrhage (Figure 1) [49,50].
Edoxaban
Kato and colleagues performed a subgroup analysis of patients aged 75 years or older enrolled in the ENGAGE AF-TIMI 48 study [50]. Currently the results are published only in abstract form. Regardless of treatment, the risk of major bleeding and stroke increased significantly with age (P < 0.001). Lower rates of major bleeding were reported with both the 60 mg and 30 mg doses of edoxaban versus warfarin (4.0%/year and 2.2%/year vs 4.8%/year, respectively; no P value provided).
Therapeutic Drug Monitoring
Collectively, the data on assessment of the anticoagulant activity of DOACs using coagulation assays are evolving. These tests include, but are not limited to, prothrombin time (PT), activated partial thromboplastin time (aPTT), thrombin clotting time (TT), dilute TT, activated clotting time (ACT), anti-factor Xa, and ecarin clotting time (ECT) assays. Although routine monitoring is not required, the ability to assess the degree of anticoagulation in select patient populations may prove beneficial. Future studies are essential to confirm whether assessing DOAC activity using coagulation assays in vulnerable populations such as the elderly improves clinical outcomes. Several reviews on this subject have been published [51–55]. The reader is encouraged to review these data, as there are significant limitations to currently available assays and incorrect interpretation may lead to suboptimal treatment decisions.
Renal and Hepatic Dysfunction
Depending on the specific agent, renal clearance of the DOACs varies from 27% to 80% [56–59]. Clinical trials typically estimate renal function with the Cockcroft-Gault (CG) formula based on actual body weight, although the landmark DOAC trials differed in how they applied it; for example, RE-LY and ROCKET-AF used actual body weight, while ARISTOTLE did not specify which body weight to use. CG estimates of renal function are frequently discordant with actual renal function in the elderly [60]. The MDRD (Modification of Diet in Renal Disease) and CKD-EPI (Chronic Kidney Disease Epidemiology Collaboration) equations are also commonly used to estimate glomerular filtration rate (GFR). In a cross-sectional study comparing the CG, MDRD, and CKD-EPI formulas using data from potential kidney donors and adult patients who underwent GFR measurement, MDRD had the smallest mean bias; age influenced the absolute bias of all formulas, and CG was additionally influenced by body weight and body mass index [60]. Compared with CG, MDRD was a more accurate predictor of GFR in adults < 70 years of age [61]. However, package inserts recommend dose adjustments based on CrCl estimated with the CG formula. This poses a problem when adjusting DOAC doses in elderly patients, in whom CG may overestimate renal function. Among elderly patients with renal impairment, discordance between estimated and measured renal function affected dabigatran and rivaroxaban dosing more than apixaban dosing [61].
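Because label-based dose adjustments hinge on the CG estimate, a worked example may be helpful. The sketch below computes CG creatinine clearance using the standard formula, CrCl = (140 − age) × weight / (72 × SCr), multiplied by 0.85 for women; the patient values are hypothetical and chosen only for illustration.

```python
# Cockcroft-Gault creatinine clearance (mL/min), standard formula using actual body weight.
# Illustrative only; the example patient is hypothetical.

def cockcroft_gault_crcl(age_years: int, weight_kg: float,
                         serum_creatinine_mg_dl: float, female: bool) -> float:
    crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
    if female:
        crcl *= 0.85  # conventional correction factor for women
    return crcl

# Hypothetical example: an 82-year-old woman, 58 kg, serum creatinine 1.2 mg/dL.
print(round(cockcroft_gault_crcl(82, 58, 1.2, female=True), 1))  # ~33 mL/min
```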
Renal excretion of unchanged dabigatran is the predominant pathway for elimination (~80%) [58]. The FDA-approved dosing strategy in the US for dabigatran is 150 mg twice daily in patients with a CrCl ≥ 30 mL/min and 75 mg twice daily in patients with severe renal impairment (CrCl 15–30 mL/min); dabigatran is contraindicated in patients with a CrCl < 15 mL/min [58]. By comparison, Canadian and European Medicines Agency labeling lists severe renal impairment (CrCl < 30 mL/min) as a contraindication. The US-approved dosage for severe renal impairment was derived during the approval process from pharmacokinetic modeling and simulation, which estimated that 75 mg twice daily would provide Cmax and Cmin concentrations similar to those of 150 mg twice daily in moderate renal impairment [62,63]. Compared to patients with CrCl ≥ 80 mL/min, there was a 1.29- and a 1.47-fold increase in dabigatran trough plasma concentration in patients with CrCl 50–80 mL/min and CrCl 30–50 mL/min, respectively. There have been many postmarketing reports of hemorrhage with dabigatran [36,84,85]. Although reporting bias is likely given the novelty of the agent, clinicians may take key clinical pearls away from these reports. Patients often had risk factors, including low body weight, renal impairment, and polypharmacy with interacting drugs (eg, amiodarone). These risk factors are also important with the other DOACs.
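The US dosing tiers described above can be summarized as a simple lookup; the sketch below encodes only what is stated in this paragraph and is not a substitute for the package insert [58].

```python
# Dabigatran dose selection by renal function, per the US dosing strategy described above.
# Illustrative only; consult the package insert for authoritative dosing.

def dabigatran_dose_by_crcl(crcl_ml_min: float) -> str:
    if crcl_ml_min >= 30:
        return "150 mg twice daily"
    if crcl_ml_min >= 15:
        return "75 mg twice daily (severe renal impairment)"
    return "contraindicated (CrCl < 15 mL/min)"

# Using the hypothetical CG estimate from the previous sketch (~33 mL/min):
print(dabigatran_dose_by_crcl(33.1))  # "150 mg twice daily"
```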
A subgroup analysis of ROCKET-AF evaluating rivaroxaban 15 mg daily in patients with a CrCl of 30–49 mL/min did not identify any differences in endpoints with the exception of fatal bleeding, which occurred less often with rivaroxaban (0.28%/yr vs. 0.74%/yr; P = 0.047) [64].
Monitoring of renal function is essential to mitigate the risk of drug accumulation. Clinicians should consider obtaining a baseline renal assessment, with annual reassessment in patients with normal renal function (CrCl ≥ 80 mL/min) or mild renal impairment (CrCl 50–79 mL/min) and reassessment 2 to 3 times per year in patients with moderate renal impairment (CrCl 30–49 mL/min) [65]. A summary of renal dose adjustments for DOAC therapy may be found in Table 5 [56–59].
In addition to renal function, hepatic impairment can also affect the metabolism of anticoagulants. Severe hepatic impairment can lead to a prolonged PT; therefore, patients with liver dysfunction who are treated with anticoagulation have an increased risk of hemorrhagic events. The large pivotal trials for the key indications of dabigatran, apixaban, and rivaroxaban excluded patients with significant signs of hepatic impairment. Table 5 provides dosing recommendations for the different DOACs in the setting of hepatic impairment [56–59].
Polypharmacy And The Potential For Adverse Consequences
Costs And Cost-Effectiveness of DOACs
With the high burden of AF and the aging population, analysis of cost and value is an important consideration [76]. Few publications directly compare the cost-effectiveness of the available anticoagulation options; however, numerous cost-effectiveness studies have evaluated the individual DOACs [71–79]. Overall, these studies suggest that the DOACs are a cost-effective alternative to warfarin in the general and elderly populations. One analysis reported that dabigatran may not be cost-effective in patients with a low CHADS2 score (≤ 2) [71].
Harrington et al [80] compared the cost-effectiveness of dabigatran, rivaroxaban, and apixaban versus warfarin. This cost-effectiveness study used published clinical trial data to build a decision model, and the results indicated that for patients ≥ 70 years of age with an increased risk for stroke, normal renal function, and no previous contraindications to anticoagulant therapy, apixaban 5 mg, dabigatran 150 mg, and rivaroxaban 20 mg were cost-effective substitutes for warfarin for the prevention of stroke in nonvalvular AF [80]. Apixaban was the preferred anticoagulant for their hypothetical cohort of 70-year-old patients with nonvalvular AF, as it was most likely to be the cost-effective treatment option at all willingness-to-pay thresholds > $40,000 per quality-adjusted life-year gained [76,81].
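For readers unfamiliar with how such conclusions are reached, the sketch below illustrates the basic incremental cost-effectiveness ratio (ICER) calculation that underlies these decision models. The costs and quality-adjusted life-years shown are hypothetical and are not taken from the cited analyses.

```python
# Minimal illustration of an incremental cost-effectiveness ratio (ICER) calculation.
# All numbers are hypothetical and not drawn from the cited studies.

def icer(cost_new: float, cost_comparator: float,
         qaly_new: float, qaly_comparator: float) -> float:
    """Incremental cost per quality-adjusted life-year (QALY) gained."""
    return (cost_new - cost_comparator) / (qaly_new - qaly_comparator)

# Hypothetical example: a DOAC strategy costing $12,000 and yielding 6.30 QALYs
# versus a warfarin strategy costing $9,000 and yielding 6.20 QALYs.
ratio = icer(12_000, 9_000, 6.30, 6.20)
print(round(ratio))  # 30000 -> below a $40,000-per-QALY willingness-to-pay threshold
```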
Prescription costs may vary depending on payor and level of insurance. For a patient without prescription insurance, the annual price of generic warfarin is roughly $200 to $360, depending on dosage. Approximate annual costs for the DOACs are greater than 20 times the cost of warfarin (apixaban $4500, dabigatran $4500, and rivaroxaban $4800) [82]. However, most patients on these medications are over 65 years old and have prescription coverage through Medicare Part D. Of note, patients may face a greater cost burden if or when they reach the “donut hole” coverage gap. Currently, once patients spend $2960 (for 2015) or $3310 (for 2016) on covered drugs, they fall into the donut hole unless they qualify for additional assistance. At this point Medicare Part D will reimburse 45% of the cost of the newer anticoagulants, since generics are currently unavailable. As a result, individual affordability may become an issue. Further complicating the scenario is the inability to apply coupon and rebate cards in the setting of government-funded prescription coverage. Clinicians should discuss these issues with their patients to help select the most valuable therapy.
Conclusions And Recommendations
Corresponding author: Luigi Brunetti, PharmD, MPH, Rutgers University, 160 Frelinghuysen Rd, Piscataway, NJ 08854, [email protected].
1. Fanikos J, Stapinski C, Koo S, et al. Medication errors associated with anticoagulant therapy in the hospital. Am J Cardiol 2004;94:532–5.
2. Budnitz DS, Lovegrove MC, Shehab N, Richards CL. Emergency hospitalizations for adverse drug events in older Americans. N Engl J Med 2011;365:2002–12.
3. Institute for Safe Medication Practices. QuarterWatch. 9 January 2013. Available at http://www.ismp.org/quarterwatch/pdfs/2012Q2.pdf.
4. Hajjar ER, Hanlon JT, Artz MB, et al. Adverse drug reaction risk factors in older outpatients. Am J Geriatr Pharmacother 2003;1:82–9.
5. Gurwitz JH, Field TS, Harrold LR, et al. Incidence and preventability of adverse drug events among older persons in the ambulatory setting. JAMA 2003;289:1107–16.
6. Singh S. Defining ‘elderly’ in clinical practice guidelines for pharmacotherapy. Pharm Pract 2014;12:489.
7. Singh S, Bajorek B. Pharmacotherapy in the aging patient: The impact of age per se (a review). Ageing Res Rev 2015 Jul 28. pii: S1568-1637(15)30008-8.
8. Go AS, Hylek EM, Phillips KA, et al. Prevalence of diagnosed atrial fibrillation in adults: national implications for rhythm management and stroke prevention: the anticoagulation and risk factors in atrial fibrillation (ATRIA) Study. JAMA 2001;285:2370–5.
9. Lip GY, Brechin CM, Lane DA. The global burden of atrial fibrillation and stroke: a systematic review of the epidemiology of atrial fibrillation in regions outside North America and Europe. Chest 2012;142:1489–98.
10. Camm AJ, Lip GY, De Caterina R, et al. 2012 Focused update of the ESC guidelines for the management of atrial fibrillation: an update of the 2010 ESC guidelines for the management of atrial fibrillation-developed with the special contribution of the European Heart Rhythm Association. Europace 2012;14:1385–413.
11. Kannel WB, Benjamin EJ. Status of the epidemiology of atrial fibrillation. Med Clin North Am 2008;92:17–40.
12. Chugh SS, Havmoeller R, Narayanan K, et al. Worldwide epidemiology of atrial fibrillation: a global burden 2010 study. Circulation 2014;129:837-47.
13. Miyasaka Y, Barnes ME, Gersh BJ, et al. Secular trends in incidence of atrial fibrillation in Olmsted County, Minnesota, 1980 to 2000, and implications on the projections for future prevalence. Circulation 2006;114:119–25.
14. Connolly SJ, Ezekowitz MD, Yusuf S, et al. Dabigatran versus warfarin in patients with atrial fibrillation. N Engl J Med 2009;361:1139–51.
15. Patel MR, Mahaffey KW, Garg J, et al. Rivaroxaban versus warfarin in nonvalvular atrial fibrillation. N Engl J Med 2011;365:883–91.
16. Granger CB, Alexander JH, McMurray JJ, et al. Apixaban versus warfarin in patients with atrial fibrillation. N Engl J Med 2011;365:981–92.
17. Apostolakis S, Lane DA, Guo Y, et al. Performance of the HEMORR2HAGES, ATRIA, and HAS-BLED Bleeding Risk–Prediction Scores in Patients With Atrial Fibrillation Undergoing Anticoagulation: The AMADEUS (Evaluating the Use of SR34006 Compared to Warfarin or Acenocoumarol in Patients With Atrial Fibrillation) Study. J Am Coll Cardiol 2012;60:861–7.
18. Fang MC, Go AS, Chang Y, et al. A new risk scheme to predict warfarin-associated hemorrhage: the ATRIA (Anticoagulation and Risk Factors in Atrial Fibrillation) study. J Am Coll Cardiol 2011;58:395–401.
19. Pisters R, Lane DA, Nieuwlaat R, et al. A novel user-friendly score (HAS-BLED) to assess 1-year risk of major bleeding in patients with atrial fibrillation: The Euro Heart Survey. Chest 2010;138:1093–100.
20. Heidbuchel H, Verhamme P, Alings M, et al. Updated European Heart Rhythm Association Practical Guide on the use of non-vitamin K antagonist anticoagulants in patients with non-valvular atrial fibrillation. Europace 2015;17:1467–507.
21. Lip GY, Frison L, Halperin JL, Lane DA. Comparative validation of a novel risk score for predicting bleeding risk in anticoagulated patients with atrial fibrillation: the HAS-BLED (Hypertension, Abnormal Renal/Liver Function, Stroke, Bleeding History or Predisposition, Labile INR, Elderly, Drugs/Alcohol Concomitantly) score. J Am Coll Cardiol 2011;57:173–80.
22. Roldán V, Marín F, Fernández H, et al. Predictive value of the HAS-BLED and ATRIA bleeding scores for the risk of serious bleeding in a "real-world" population with atrial fibrillation receiving anticoagulant therapy. Chest 2013;143:179–84.
23. Robert-Ebadi H, Le Gal G, Righini M. Use of anticoagulants in elderly patients: practical recommendations. Clin Interv Aging 2009;4:165–77.
24. Barcellona D, Contu P, Sorano GG, et al. The management of oral anticoagulant therapy: the patient's point of view. Thromb Haemost 2000;83:49–53.
25. Lancaster TR, Singer DE, Sheehan MA, et al. The impact of long-term warfarin therapy on quality of life. Evidence from a randomized trial. Boston Area Anticoagulation Trial for Atrial Fibrillation Investigators. Arch Intern Med 1991;151:1944–9.
26. Devereaux PJ, Anderson DR, Gardner MJ, et al. Differences between perspectives of physicians and patients on anticoagulation in patients with atrial fibrillation: observational study. BMJ 2001;323:1218–22.
27. Giugliano RP, Ruff CT, Braunwald E, Murphy SA. Edoxaban versus warfarin in patients with atrial fibrillation. N Engl J Med 2013;369:2093–104.
28. Barco S, Cheung YW, Eikelboom JW, Coppens M. New oral anticoagulants in elderly patients. Best Pract Res Clin Haematol 2013;26:215–24.
29. Eikelboom JW, Wallentin L, Connolly SJ, et al. Risk of bleeding with 2 doses of dabigatran compared with warfarin in older and younger patients with atrial fibrillation: an analysis of the randomized evaluation of long-term anticoagulant therapy (RE-LY) trial. Circulation 2011;123:2363–72.
30. Coppens M, Eikelboom JW, Ezekowitz M, et al. Dabigatran versus warfarin in very elderly patients with atrial fibrillation: results from the RE-LY trial. Abstract. Circulation 2012;126:A15l537.
31. Halperin JL, Wojdyla D, Piccini JP, et al. Efficacy and safety of rivaroxaban compared with warfarin among elderly patients with nonvalvular atrial fibrillation in the ROCKET-AF trial. Abstract. Stroke 2012;43:A148.
32. Ruff CT, Giugliano RP, Braunwald E, et al. Comparison of the efficacy and safety of new oral anticoagulants with warfarin in patients with atrial fibrillation: a meta-analysis of randomised trials. Lancet 2014;383:955–62.
33. Clemens A, Strack A, Noack H, et al. Anticoagulant-related gastrointestinal bleeding—could this facilitate early detection of benign or malignant gastrointestinal lesions? Ann Med 2014;46:672–8.
34. Petty GW, Khandheria BK, Whisnant JP, et al. Predictors of cerebrovascular events and death among patients with valvular heart disease: A population-based study. Stroke 2000;31:2628–35.
35. Eikelboom JW, Connolly SJ, Brueckmann M, et al. Dabigatran versus warfarin in patients with mechanical heart valves. N Engl J Med 2013;369:1206–14.
36. Schomburg JL, Medina EM, Lahti MT, Bianco RW. Dabigatran versus warfarin after mechanical mitral valve replacement in the swine model. J Invest Surg 2012;25:150–5.
37. Douketis J, Bell AD, Eikelboom J, Liew A. Approach to the new oral anticoagulants in family practice: part 2: addressing frequently asked questions. Can Fam Physician 2014;60:997–1001.
38. Frost CE, Nepal S, Barrett YC, LaCreta F. Effects of age and gender on the single-dose pharmacokinetics (PK) and pharmacodynamics (PD) of apixaban. Abstract. J Thromb Haemost 2009;7(Suppl 2):PP-MO-407.
39. Stangier J, Stahle H, Rathgen K, et al. Pharmacokinetics and pharmacodynamics of the direct oral thrombin inhibitor dabigatran in healthy elderly subjects. Clin Pharmacokinet 2008;47:47–59.
40. Kubitza D, Becka M, Mueck W. The effect of extreme age, and gender on the pharmacology and tolerability of rivaroxaban, an oral direct factor Xa inhibitor. Blood 2006;108: Abstract 905.
41. Siegal DM, Crowther MA. Acute management of bleeding in patients on novel oral anticoagulants. Eur Heart J 2013;34:489–98.
42. Evans A, Kalra L. Are the results of randomized controlled trials on anticoagulation in patients with atrial fibrillation generalizable to clinical practice? Arch Intern Med 2001;161:1443–7.
43. Harper P, Young L, Merriman E. Bleeding risk with dabigatran in the frail elderly. N Engl J Med 2012;366:864–6.
44. Graham DJ, Reichman ME, Wernecke M, et al. Cardiovascular, bleeding, and mortality risks in elderly Medicare patients treated with dabigatran or warfarin for nonvalvular atrial fibrillation. Circulation 2015;131:157–64.
45. Avgil-Tsadok M, Jackevicius CA, Essebag V, et al. Dabigatran use in elderly patients with atrial fibrillation. Thromb Haemost 2015;115(1).
46. Uchino K, Hernandez AV. Dabigatran association with higher risk of acute coronary events: meta-analysis of noninferiority randomized controlled trials. Arch Intern Med 2012;172:397–402.
47. Halvorsen S, Atar D, Yang H, et al. Efficacy and safety of apixaban compared with warfarin according to age for stroke prevention in atrial fibrillation: observations from the ARISTOTLE trial. Eur Heart J 2014;35:1864–72.
48. Lega JC, Bertoletti L, Gremillet C, et al. Consistency of safety profile of new oral anticoagulants in patients with renal failure. J Thromb Haemost 2014;12:337–43.
49. Ng KH, Hart RG, Eikelboom JW. Anticoagulation in patients aged ≥ 75 years with atrial fibrillation: role of novel oral anticoagulants. Cardiol Ther 2013;2:135–49.
50. Kato ET, Giugliano RP, Ruff CT, et al. Efficacy and safety of edoxaban for the management of elderly patients with atrial fibrillation: ENGAGE AF-TIMI 48. Circulation 2014;130:A16612.
51. Tripodi A. The laboratory and the new oral anticoagulants. Clin Chem 2013;59:353–62.
52. Tripodi A, Di Iorio G, Lippi G, et al. Position paper on laboratory testing for patients taking new oral anticoagulants. Consensus document of FCSA, SIMeL, SIBioC and CISMEL. Clin Chem Lab Med 2012;50:2137–40.
53. Heidbuchel H, Verhamme P, Alings M, et al. European Heart Rhythm Association Practical Guide on the use of new oral anticoagulants in patients with non-valvular atrial fibrillation. Europace 2013;15:625–51.
54. Chin PK, Wright DF, Patterson DM, et al. A proposal for dose-adjustment of dabigatran etexilate in atrial fibrillation guided by thrombin time. Br J Clin Pharmacol 2014;78:599–609.
55. Miyares MA, Davis K. Newer oral anticoagulants: a review of laboratory monitoring options and reversal agents in the hemorrhagic patient. Am J Health Syst Pharm 2012;69:1473–84.
56. Xarelto [package insert]. Titusville, NJ: Janssen Pharmaceuticals. September 2014.
57. Eliquis [package insert]. Princeton, NJ: Bristol-Myers Squibb. June 2015.
58. Pradaxa [package insert]. Ridgefield, CT: Boehringer Ingelheim Pharmaceuticals. October 2010.
59. Savaysa [package insert]. Parsippany, NJ: Daiichi Sankyo. September 2015.
60. Michels WM, Grootendorst DC, Verduijn M, et al. Performance of the Cockcroft-Gault, MDRD, and new CKD-EPI formulas in relation to GFR, age, and body size. Clin J Am Soc Nephrol 2010;5:1003–9.
61. Poulsen BK, Grove EL, Husted SE. New oral anticoagulants: a review of the literature with particular emphasis on patients with impaired renal function. Drugs 2012;72:1739–53.
62. Hariharan S, Madabushi R. Clinical pharmacology basis of deriving dosing recommendations for dabigatran in patients with severe renal impairment. J Clin Pharmacol 2012;52:119S–25S.
63. Lehr T, Haertter S, Liesenfeld KH, et al. Dabigatran etexilate in atrial fibrillation patients with severe renal impairment: dose identification using pharmacokinetic modeling and simulation. J Clin Pharmacol 2012;52:1373–8.
64. Fox KAA, Piccini JP, Wojdyla D, et al. Prevention of stroke and systemic embolism with rivaroxaban compared with warfarin in patients with non-valvular atrial fibrillation and moderate renal impairment. Eur Heart J 2011;32:2387–94.
65. Pengo V, Crippa L, Falanga A, et al. Questions and answers on the use of dabigatran and perspectives on the use of other new oral anticoagulants in patients with atrial fibrillation. A consensus document of the Italian Federation of Thrombosis Centers (FCSA). Thromb Haemost 2011;106:868–76.
66. Atkin PA, Veitch PC, Veitch EM, Ogle SJ. The epidemiology of serious adverse drug reactions among the elderly. Drugs Aging 1999;14:141–52.
67. Qato DM, Alexander GC, Conti RM, et al. Use of prescription and over-the-counter medications and dietary supplements among older adults in the United States. JAMA 2008;300:2867–78.
68. Skov J, Bladbjerg EM, Sidelmann J, et al. Plenty of pills: polypharmacy prevails in patients of a Danish anticoagulant clinic. Eur J Clin Pharmacol 2011;67:1169–74.
69. Ukena C, Bohm M, Schirmer SH. Hot topics in cardiology: data from IABP-SHOCK II, TRILOGY-ACS, WOEST, ALTITUDE, FAME II and more. Clin Res Cardiol 2012;101:861–74.
70. Dewilde WJ, Oirbans T, et al. Use of clopidogrel with or without aspirin in patients taking oral anticoagulant therapy and undergoing percutaneous coronary intervention: an open-label, randomised, controlled trial. Lancet 2013;381:1107–15.
71. Shah SV, Gage BF. Cost-effectiveness of dabigatran for stroke prophylaxis in atrial fibrillation. Circulation 2011;123:2562–70.
72. Sorensen SV, Kansal AR, Connolly S, et al. Cost-effectiveness of dabigatran etexilate for the prevention of stroke and systemic embolism in atrial fibrillation: a Canadian payer perspective. Thromb Haemost 2011;105:908–19.
73. Adcock AK, Lee-Iannotti JK, Aguilar MI, et al. Is dabigatran cost effective compared with warfarin for stroke prevention in atrial fibrillation?: a critically appraised topic. Neurologist 2012;18:102–7.
74. Kamel H, Johnston SC, Easton JD, Kim AS. Cost-effectiveness of dabigatran compared with warfarin for stroke prevention in patients with atrial fibrillation and prior stroke or transient ischemic attack. Stroke 2012;43:881–3.
75. Langkilde LK, Bergholdt AM, Overgaard M. Cost-effectiveness of dabigatran etexilate for stroke prevention in non-valvular atrial fibrillation. J Med Econ 2012;15:695–703.
76. Kansal AR, Sorensen SV, Gani R, et al. Cost-effectiveness of dabigatran etexilate for the prevention of stroke and systemic embolism in UK patients with atrial fibrillation. Heart 2012; 98:573–8.
77. Freeman JV, Zhu RP, Owens DK, et al. Cost-effectiveness of dabigatran compared with warfarin for stroke prevention in atrial fibrillation. Ann Intern Med 2011;154:1–11.
78. Pink J, Lane S, Pirmohamed M, Hughes DA. Dabigatran etexilate versus warfarin in management of non-valvular atrial fibrillation in UK context: quantitative benefit-harm and economic analyses. BMJ 2011;343:d6333.
79. Ali A, Bailey C, Abdelhafiz AH. Stroke prophylaxis with warfarin or dabigatran for patients with non-valvular atrial fibrillation-cost analysis. Age Ageing 2012;41:681–4.
80. Harrington AR, Armstrong EP, Nolan PE Jr, Malone DC. Cost effectiveness of apixaban, dabigatran, rivaroxaban, and warfarin for stroke prevention in atrial fibrillation. Stroke 2013;44:1676–81.
81. Amin A, Lingohr-Smith M, Bruno A, et al. Economic evaluations of medical cost differences: use of targeted-specific oral anticoagulants vs. warfarin among patients with nonvalvular atrial fibrillation and venous thromboembolism in the US. J Hematol Thrombo Dis 2015;3:209.
82. Lexicomp, Lexi-Drugs. Hudson, OH: Lexi-Comp.
83. U.S. Food and Drug Administration, Center for Drug Evaluation and Research. Apixaban NDA 202155/S-002 approval letter. Jan 30 2014. Available at http://www.accessdata.fda.gov/drugsatfda_docs/appletter/2014/202155Orig1s002ltr.pdf
84. Hinojar R, Jimenez-Natcher JJ, Fernandez-Golfin C, Zamorano JL. New oral anticoagulants: a practical guide for physicians. Eur Heart J Cardiovasc Pharmacother 2015;1:134–45.
85. Connolly SJ, Eikelboom J, Joyner C, et al. Apixaban in patients with atrial fibrillation. N Engl J Med 2011;364:806–17.