Hospital Value‐Based Purchasing

Daniel Blumenthal, MD, MBA

Affiliations: Department of Medicine, Massachusetts General Hospital, Boston, Massachusetts; National Bureau of Economic Research, Cambridge, Massachusetts

The Centers for Medicare and Medicaid Services' (CMS) Hospital Inpatient Value‐Based Purchasing (VBP) Program, which was signed into law as part of the Patient Protection and Affordable Care Act of 2010, aims to incentivize inpatient providers to deliver high‐value, as opposed to high‐volume, healthcare.[1] Beginning on October 1, 2012, the start of the 2013 fiscal year (FY), hospitals participating in the VBP program became eligible for a variety of performance‐based incentive payments from CMS. These payments are based on an acute care hospital's ability to meet performance measures in 6 care domains: (1) patient safety, (2) care coordination, (3) clinical processes and outcomes, (4) population or community health, (5) efficiency and cost reduction, and (6) patient‐ and caregiver‐centered experience.[2] The VBP program's ultimate purpose is to enable CMS to improve the health of Medicare beneficiaries by purchasing better care for them at a lower cost. These 3 characteristics of care (improved health, improved care, and lower costs) are the foundation of CMS' conception of value.[1, 2] They are closely related to an economic conception of value, which is the difference between an intervention's benefit and its cost.

Although not a new idea in principle, the formal use of financial incentives to push hospitals toward high‐value healthcare marks an important change in Medicare and Medicaid policy. In this timely review of VBP, we first discuss the relevant historical changes in the reimbursement environment of US hospitals that have set the stage for VBP. We then describe the structure of CMS' VBP program, with a focus on which facilities are eligible to participate in the program, the specific outcomes measured and incentivized, how rewards and penalties are allocated, and how the program will be funded. In an effort to anticipate some of the issues that lie ahead, we then highlight a number of potential challenges to the success of VBP, and discuss how VBP will impact the delivery and reimbursement of inpatient care services. We conclude by examining how the VBP program is likely to evolve over time.

HISTORICAL CONTEXT FOR VBP

Over the last decade, CMS has embarked on a number of initiatives to incentivize the provision of higher‐quality and more cost‐effective care. For example, in 2003, CMS implemented a national pay‐for‐performance (P4P) pilot project called the Premier Hospital Quality Incentive Demonstration (HQID).[3, 4] HQID, which ran for 6 years, tracked and rewarded the performance of 216 hospitals in 6 healthcare service domains: (1) acute myocardial infarction (AMI), (2) congestive heart failure (CHF), (3) pneumonia, (4) coronary artery bypass graft surgery, (5) hip and knee replacement surgery, and (6) perioperative management of surgical patients (including prevention of surgical site infections).[4] CMS then introduced its Hospital Compare Web site in 2005 to facilitate public reporting of hospital‐level quality outcomes.[3, 5] This Web site provides the public with access to data on hospital performance across a wide array of measures of process quality, clinical outcomes, spending, and resource utilization.[5] Next, in October 2008, CMS stopped reimbursing hospitals for a number of costly and common hospital‐acquired complications, including hospital‐acquired bloodstream infections and urinary tract infections, patient falls, and pressure ulcers.[3, 6] VBP is the latest and most comprehensive step that CMS has taken in its decade‐long effort to shift from volume to value‐based compensation for inpatient care.

Although CMS appears fully invested in using performance incentives to increase healthcare value, existing evidence of the effects of P4P on patient outcomes remains quite mixed.[7] On one hand, an analysis of an inpatient P4P program sponsored by the United Kingdom's National Health Service (NHS) suggests that P4P may improve quality and save lives; hospitals that participated in the NHS P4P program significantly reduced inpatient mortality from pneumonia, saving an estimated 890 lives.[8] Additional empirical work suggests that the HQID was also associated with early improvements in healthcare quality.[9] However, a subsequent long‐term analysis found that participation in HQID had no discernible effect on 30‐day mortality rates.[10] Moreover, a systematic review of P4P incentives for individual practitioners found few methodologically robust studies of P4P for clinicians and concluded that P4P's effects on individual practice patterns and outcomes remain largely uncertain.[11]

VBP: STRUCTURE AND DESIGN

This section reviews the structure of the VBP program. We describe current VBP eligibility criteria and sources of funding for the program, how hospitals participating in VBP are evaluated, and how VBP incentives for FY 2013 have been calculated.

Hospital Eligibility for VBP

All acute care hospitals in the United States (excluding Maryland) that are not psychiatric hospitals, rehabilitation hospitals, long‐term care facilities, children's hospitals, or cancer hospitals are eligible to participate in VBP in FY 2013 (full eligibility criteria are outlined in Table 1). For FY 2013, CMS chose to incentivize measures in just 2 care domains: (1) clinical processes of care and (2) patient experience of care. To be eligible for VBP in FY 2013, a hospital must report at least 10 cases each in at least 4 of the 12 measures included in the clinical processes of care domain (Table 2), and/or must have at least 100 completed Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) surveys. Designed and validated by CMS, the HCAHPS survey provides hospitals with a standardized instrument for gathering information about patient satisfaction with, and perspectives on, their hospital care.[12] HCAHPS will be used to assess 8 patient experience of care measures (Table 3).

Inclusion and Exclusion Criteria for the Inpatient Value‐Based Purchasing Program in Fiscal Year 2013
  • NOTE: Abbreviations: HCAHPS, Hospital Consumer Assessment of Healthcare Providers and Systems; HHS, US Department of Health and Human Services; VBP, Value‐Based Purchasing.

Inclusion criteria
  • Acute care hospital
  • Located in any of the 50 US states (excluding Maryland) or the District of Columbia
  • Has at least 10 cases in at least 4 of 12 clinical process of care measures and/or at least 100 completed HCAHPS surveys
Exclusion criteria
  • Psychiatric, rehabilitation, long‐term care, children's, or cancer hospital
  • Does not participate in the Hospital Inpatient Quality Reporting Program during the VBP performance period
  • Cited by the Secretary of HHS for significant patient safety violations during the performance period
  • Does not meet minimum reporting requirements for number of cases, process measures, and surveys needed to participate in VBP
Clinical Process of Care Measures Evaluated by Value‐Based Purchasing in Fiscal Year 2013
  • NOTE: Mortality measures to be added in fiscal year 2014: acute myocardial infarction, congestive heart failure, pneumonia.

Acute myocardial infarction
  • Fibrinolytic therapy received within 30 minutes of hospital arrival
  • Primary percutaneous coronary intervention received within 90 minutes of hospital arrival
Heart failure
  • Discharge instructions provided
Pneumonia
  • Blood cultures performed in the emergency department prior to initial antibiotic received in hospital
  • Initial antibiotic selection for community‐acquired pneumonia in immunocompetent patient
Healthcare‐associated infections
  • Prophylactic antibiotic received within 1 hour prior to surgical incision
  • Prophylactic antibiotic selection for surgical patients
  • Prophylactic antibiotics discontinued within 24 hours after surgery ends
  • Cardiac surgery patients with controlled 6:00 am postoperative serum glucose
Surgeries
  • Surgery patients on a β‐blocker prior to arrival who received a β‐blocker during the perioperative period
  • Surgery patients with recommended venous thromboembolism prophylaxis ordered
  • Surgery patients who received appropriate venous thromboembolism prophylaxis within 24 hours prior to surgery to 24 hours after surgery
Patient Experience of Care Measures Evaluated by Value‐Based Purchasing in Fiscal Year 2013
  • Communication with nurses
  • Communication with doctors
  • Responsiveness of hospital staff
  • Pain management
  • Communication about medicines
  • Cleanliness and quietness of hospital environment
  • Discharge information
  • Overall rating of hospital

Participation in the program is mandatory for eligible hospitals, and CMS estimates that more than 3000 facilities across the United States will participate in FY 2013. Roughly $850 million in VBP incentives will be paid out to these participating hospitals in FY 2013. The program is being financed through a 1% across‐the‐board reduction in FY 2013 diagnosis‐related group (DRG)‐based inpatient payments to participating hospitals. On December 20, 2012, CMS publicly announced FY 2013 VBP incentives for all participating hospitals. Each hospital's incentive is retroactive and based on its performance between July 1, 2011 and March 31, 2012.

All data used for calculating VBP incentives are reported to CMS through its Hospital Inpatient Quality Reporting (Hospital IQR) Program, a national program instituted in 2003 that rewards hospitals for reporting designated quality measures. As of 2007, approximately 95% of eligible US hospitals were participating in the Hospital IQR program.[1] Measures evaluated via chart abstraction and surveys reflect a hospital's performance for its entire patient population, whereas measures assessed with claims data reflect hospital performance only for Medicare patients.

Evaluation of Hospitals

In FY 2013, hospital VBP incentive payments will be based entirely on performance in 2 domains: (1) clinical processes of care (weighted 70%) and (2) patient experience of care (weighted 30%). For each domain, CMS will evaluate each hospital's improvement over time as well as its achievement compared to other hospitals in the VBP program. By assessing and rewarding both achievement and improvement, CMS ensures that lower‐performing hospitals can still be rewarded for making substantial improvements in quality. To evaluate the first metric, improvement over time, CMS will compare a hospital's performance during a given reporting period with its baseline performance 2 years prior to that period. A hospital receives improvement points for improving its performance over time. To assess the second metric, achievement compared to other hospitals in the VBP program, CMS will compare each hospital's performance during a reporting period with the baseline performance (ie, performance 2 years prior to the reporting period) of all other hospitals in the VBP program. A hospital is awarded achievement points if its performance exceeds the 50th percentile of all hospitals during the baseline performance period. Improvement scores range from 0 to 9, whereas achievement scores range from 0 to 10. The greater of a hospital's improvement and achievement scores on each VBP measure is used to calculate the hospital's total earned clinical care domain score and total earned HCAHPS base score. Hospitals that lack the baseline performance data required to assess improvement will be evaluated solely on the basis of achievement points.[1] The total earned clinical care domain score is multiplied by 70% to arrive at the clinical care domain's contribution to a hospital's total performance score.
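
For readers who want to see the arithmetic, the sketch below walks through a simplified scoring routine consistent with the description above. The 0‐to‐10 achievement scale, the 0‐to‐9 improvement scale, and the rule of counting the greater of the two are taken from the program as described; the specific threshold and benchmark values, the linear interpolation between them, and all function names are illustrative assumptions rather than CMS' published formulas.

```python
def achievement_points(rate, threshold, benchmark):
    """Achievement: 0-10 points for performance relative to other hospitals.

    `threshold` stands in for the 50th percentile of all hospitals' baseline
    rates and `benchmark` for a top-performer rate; the linear interpolation
    is an illustrative simplification of CMS' scoring rules.
    """
    if rate < threshold:
        return 0.0
    if rate >= benchmark:
        return 10.0
    return 10.0 * (rate - threshold) / (benchmark - threshold)


def improvement_points(rate, own_baseline, benchmark):
    """Improvement: 0-9 points for performance relative to the hospital's own
    baseline 2 years earlier (again, a simplified linear scale)."""
    if rate <= own_baseline:
        return 0.0
    if rate >= benchmark:
        return 9.0
    return 9.0 * (rate - own_baseline) / (benchmark - own_baseline)


def clinical_process_domain_score(measures):
    """For each measure, the greater of achievement and improvement points
    counts; the domain score is the share of possible points earned."""
    earned = sum(
        max(achievement_points(m["rate"], m["threshold"], m["benchmark"]),
            improvement_points(m["rate"], m["baseline"], m["benchmark"]))
        for m in measures
    )
    return earned / (10.0 * len(measures))


# Hypothetical hospital: strong on one measure, improving on another.
example = [
    {"rate": 0.97, "baseline": 0.95, "threshold": 0.92, "benchmark": 0.99},
    {"rate": 0.88, "baseline": 0.70, "threshold": 0.90, "benchmark": 0.98},
]
print(clinical_process_domain_score(example))  # later weighted 70% of the total score
```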

Each hospital's total patient experience domain, or HCAHPS performance, score consists of 2 components: a total earned HCAHPS base score, as described above, and a consistency score. The consistency score evaluates the reliability of a hospital's performance across all 8 patient experience of care measures (Table 3). If a hospital is above the 50th percentile of all hospital scores during the baseline period on all 8 measures, it receives 100% of its consistency points. If a hospital is at the 0 percentile for any given measure, it receives 0 consistency points. This provision promotes consistency by harshly penalizing hospitals with extremely poor performance on any 1 specific measure. If a hospital falls between the 0 and 50th percentiles on 1 or more measures, it receives a consistency score that takes into account how many measures fell below the 50th percentile and their distance from this threshold. Each hospital's total HCAHPS performance score (the sum of total earned HCAHPS base points and consistency points) is then multiplied by 30% to arrive at the patient experience of care domain's contribution to a hospital's total performance score.
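
A similarly hedged sketch of the patient experience domain appears below. The boundary behavior (full consistency credit when every measure is above the 50th percentile, none when any measure sits at the 0 percentile) follows the description above; keying the partial credit to the single worst‐performing measure and the 80/20 split between base and consistency points are assumptions made purely for illustration.

```python
def hcahps_consistency_points(percentiles, max_points=20.0):
    """Consistency credit keyed to the worst-performing of the 8 HCAHPS measures.

    `percentiles` holds the hospital's percentile rank (0-1) on each measure
    relative to the baseline-period distribution. Full credit above the 50th
    percentile on all measures, zero credit if any measure sits at the 0
    percentile, proportional credit in between (an illustrative rule).
    """
    worst = min(percentiles)
    if worst >= 0.5:
        return max_points
    if worst <= 0.0:
        return 0.0
    return max_points * (worst / 0.5)


def hcahps_domain_score(base_points, percentiles, base_max=80.0, consistency_max=20.0):
    """Patient experience domain score: earned base points plus consistency
    points, as a share of the points possible (the 80/20 split is assumed)."""
    earned = base_points + hcahps_consistency_points(percentiles, consistency_max)
    return earned / (base_max + consistency_max)


# Hypothetical hospital: solid base score, but one measure near the 20th percentile.
print(hcahps_domain_score(base_points=60.0,
                          percentiles=[0.7, 0.8, 0.2, 0.9, 0.6, 0.75, 0.65, 0.55]))
```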

Importantly, CMS excluded from its VBP initiative 10 clinical process measures reported in the Hospital IQR Program because they are topped out; that is, almost all hospitals already perform them at very high rates (Table 4). Examples of these topped out process measures include administration of aspirin to all patients with AMI on arrival at the hospital; counseling of patients with AMI, CHF, and pneumonia about smoking cessation; and prescribing angiotensin‐converting enzyme inhibitors or angiotensin receptor blockers to patients with CHF and left ventricular dysfunction.[1]

Topped Out Measures
  • NOTE: Abbreviations: ACEI, angiotensin‐converting enzyme inhibitor; ARB, angiotensin receptor blocker.

Acute myocardial infarction
  • Aspirin administered on arrival to the emergency department
  • ACEI or ARB prescribed on discharge
  • Patient counseled about smoking cessation
  • β‐Blocker prescribed on discharge
  • Aspirin prescribed at discharge
Heart failure
  • Patient counseled about smoking cessation
  • Evaluation of left ventricular systolic function
  • ACEI or ARB prescribed for left ventricular systolic dysfunction
Pneumonia
  • Patient counseled about smoking cessation
Surgical Care Improvement Project
  • Surgery patients with appropriate hair removal

Calculation of VBP Incentives and Public Reporting

A hospital's total performance score for FY 2013 is equal to the sum of 70% of its clinical care domain score and 30% of its total HCAHPS performance score. This total performance score is entered into a linear mathematical formula to calculate each hospital's incentive payment. CMS projects that VBP will lead to a net increase in Medicare payments for one‐half of hospitals and a net decrease in payments for the other half of participating facilities.[1]
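
Under stated assumptions, this final step can be illustrated numerically. The 70/30 weighting and the 1% withhold come from the text above; the slope of the linear exchange function is set by CMS so that the program remains budget neutral, and the particular slope value used below is purely hypothetical.

```python
def total_performance_score(clinical_domain, hcahps_domain):
    """FY 2013 total performance score: 70% clinical process of care plus
    30% patient experience of care (both domain scores on a 0-1 scale)."""
    return 0.70 * clinical_domain + 0.30 * hcahps_domain


def net_payment_change(tps, slope, withhold=0.01):
    """Linear exchange function: a hospital earns back `slope * tps` of its
    DRG-based payments after contributing the 1% withhold. CMS chooses `slope`
    each year so that total bonuses equal the total withhold; the value passed
    in below is hypothetical."""
    return slope * tps - withhold


# With a hypothetical slope of 0.0175, a hospital needs a total performance
# score above 0.01 / 0.0175 (about 0.57) to come out ahead.
tps = total_performance_score(clinical_domain=0.65, hcahps_domain=0.68)
print(tps, net_payment_change(tps, slope=0.0175))
```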

In December 2012, CMS publicly disclosed information about the initial performance of each hospital in the VBP program. Reported information included: (1) hospital performance on each applicable performance measure, (2) hospital performance by disease condition or procedure, and (3) each hospital's total performance score. Initial analyses of these performance data revealed that 1557 hospitals will receive bonus payments under VBP in FY 2013, whereas 1427 hospitals will lose money under this program. Treasure Valley Hospital, a 10‐bed physician‐owned hospital in Boise, Idaho, will receive a 0.83% increase in Medicare payments, the largest payment increase under VBP in 2013. Conversely, Auburn Community Hospital in upstate New York will suffer the most severe payment reduction: 0.9% per Medicare admission. The penalty will cost Auburn Hospital about $100,000, which is slightly more than 0.1% of its yearly $85 million operating budget.[13] For almost two‐thirds of participating hospitals, FY 2013 Medicare payments will change by <0.25%.[13] Additional information about VBP payments for FY 2013, including the number of hospitals that received VBP incentives and the size and range of these payments, is now accessible to the public through CMS' Hospital Compare Web site (http://www.hospitalcompare.hhs.gov).

CHALLENGES OF VBP

As the Medicare VBP program evolves, and hospitals confront ever‐larger financial incentives to deliver high‐value as opposed to high‐volume care, it will be important to recognize limitations of the VBP program as they arise. Here we briefly discuss several conceptual and implementation challenges that physicians and policymakers should consider when assessing the merits of VBP in promoting high‐quality healthcare.

Rigorous and Continuous Evaluation of VBP Programs

The main premise of using VBP to incentivize hospitals to deliver high‐quality, cost‐effective care is that the process measures used to determine hospital quality actually affect patient outcomes. However, it is already well established that improvements in measures of process quality are not always associated with improvements in patient outcomes.[14, 15, 16] Moreover, incentivizing specific process measures encourages hospitals to shift resources away from other aspects of care delivery, which may have ambiguous, or even deleterious, effects on patient outcomes. Although incentives ideally push hospitals to shift resources away from low‐quality care toward high‐quality care, in practice this is not always the case. Hospital resources may instead be drawn away from areas that are not yet incentivized by VBP, but for which improvements in quality of care are desperately needed. The same empirical rigor behind using VBP to incentivize hospitals to improve patient outcomes efficiently should be used to evaluate whether VBP is continually meeting its stated goals: reducing overall patient morbidity and mortality and improving patient satisfaction, ideally at lower cost. The experience of the US education system with public policies designed to improve student testing performance may serve as a cautionary example here. Such policies, which provide financial rewards to schools whose students perform well on standardized tests, can indeed raise testing performance. However, these policies also lead educators to teach to the test, and to neglect important topics that are not tested on standardized exams.[17]

Prioritization of Process Measures

As payment incentives for VBP currently stand, process measures are weighted equally regardless of the clinical benefits they generate and the resources required to achieve improvements in process quality. For instance, 2 process measures, continuing home β‐blocker medications for patients with coronary artery disease undergoing surgery and early percutaneous coronary intervention for patients with AMI, may be weighted equally even though their clinical benefits and implementation costs differ substantially. Some hospitals responding to VBP incentives may choose to invest in areas where their ability to earn VBP incentive payments is high and the costs of improvement are low, even though those areas may not be where interventions are most needed or where clinical outcomes could be most improved. Recognizing that process measures have heterogeneous benefits and implementation costs is important when prioritizing their reimbursement in VBP.

Measuring Improvements in Hospital Quality

Tying hospital financial compensation to hospital quality implies that measures of hospital quality should be robust. To incentivize hospitals to improve quality not only relative to other hospitals but also relative to their own past performance, the VBP program has established a baseline performance for each hospital, against which the hospital is compared in subsequent evaluation periods. Thus, properly measuring a hospital's baseline performance is important. During a given baseline period, some hospitals may have better or worse outcomes than their steady state due to random variation alone. Some hospitals deemed to have a low baseline will experience improvements in quality that reflect chance alone rather than active efforts to improve quality. Similarly, some hospitals deemed to have a high baseline will experience reductions in quality through chance. Neither of these changes should be subject to differences in reimbursement, because they do not reflect actual organizational changes made by the hospitals. The VBP program has made significant efforts to address this issue by requiring participating hospitals to have a large enough sample of cases that estimated rates of process quality adherence meet a reliability threshold (ie, are likely to be consistent over time rather than vary substantially through chance alone). However, not all process measures exhibit high reliability, particularly those for which adverse events are rare (eg, foreign objects retained after surgery, air embolisms, and blood incompatibility). Ultimately, CMS' need to balance statistically reliable data against the goal of including as many hospitals as possible in the VBP program will require ongoing reevaluation.
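
A brief simulation illustrates the problem of chance variation. The hospital below has a fixed, unchanging true adherence rate; with a small number of eligible cases, its measured baseline and performance‐period rates still differ, so apparent "improvement" or "decline" appears without any organizational change. The specific rate, case count, and trial count are arbitrary illustrative choices.

```python
import random

def apparent_improvement_rate(true_rate=0.90, n_cases=30, trials=10_000, seed=1):
    """Share of simulated hospitals that look 'improved' purely by chance,
    even though the underlying quality of care never changes."""
    rng = random.Random(seed)
    improved = 0
    for _ in range(trials):
        baseline = sum(rng.random() < true_rate for _ in range(n_cases)) / n_cases
        follow_up = sum(rng.random() < true_rate for _ in range(n_cases)) / n_cases
        if follow_up > baseline:  # looks like improvement, but is noise
            improved += 1
    return improved / trials

# With only 30 cases per period, roughly 4 in 10 unchanged hospitals show a
# nominally higher rate in the follow-up period; larger case counts shrink
# the size (though not the sign) of these chance differences.
print(apparent_improvement_rate())
```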

Choosing Hospital Comparators Appropriately

In the current VBP program, hospitals will be evaluated in part by how they compare to hospitals nationally. However, studies of regional variation in healthcare have demonstrated large variations in practice patterns across the United States,[18, 19, 20] raising the question of whether hospitals should, at least initially, be compared to hospitals in the same geographic area. Although the ultimate goal of VBP should be to hold hospitals to a national standard, local practice patterns are not easily modified within 1‐ to 2‐year timeframes. Initially comparing hospitals to a national rather than local standard may unfairly penalize hospitals that are relative underperformers nationally but overperformers regionally. Although CMS' policy of rewarding improvement within hospitals over time mitigates issues arising from a cross‐sectional comparison of hospitals, the issue still remains if many hospitals within a region not only underperform relative to other hospitals nationally but also fail to demonstrate improvement. More broadly, this issue extends to differences across hospitals in factors that impact their ability to meet VBP goals. These factors may include, for example, hospital size, profitability, patient case and insurance mix, and presence of an electronic medical record. Comparing hospitals with vastly different abilities to achieve VBP goals and improve quickly may amount to inequitable policy.

Continual Evaluation of Topped‐Out Measures

Process measures that are met at high rates at nearly all hospitals are not used in evaluations by CMS for VBP. An assumption underlying CMS' decision not to reward hospitals for achieving these topped‐out measures is that once physicians and hospitals make the cognitive and system‐level changes that improve process quality, these gains will persist after the incentive is removed. Thus, CMS hopes and anticipates that although performance incentives will make it easier for well‐meaning physicians to learn to do the right thing, doctors will continue to do the right things for patients after these incentives are removed.[21, 22] Although this assumption may generally be accurate, it is important to continue to evaluate whether measures that are currently topped out remain adequately performed, because rewarding new quality measures will necessarily lead hospitals to reallocate resources away from other clinical activities. Although we hope that the continued public reporting of topped‐out measures will prevent declines in performance on these measures, policy makers and clinicians should be aware that the lack of financial incentives for topped‐out measures may result in declines in quality. To this point, an analysis of 35 Kaiser Permanente facilities from 1997 to 2007 demonstrated that the removal of financial incentives for diabetic retinopathy and cervical cancer screening was associated with subsequent declines in performance of 3% and 1.6% per year, respectively.[23]

Will VBP Incentives Be Large Enough to Change Practice Patterns?

The VBP program's ability to influence change depends, at least in part, on how the incentives offered under this program compare to the magnitude of the investments that hospitals must make to achieve a given reward. In general, larger incentives are necessary to motivate more significant changes in behavior or to influence organizations to invest the resources needed to achieve change. The incentives offered under VBP in FY 2013 are quite modest. Almost two‐thirds of participating hospitals will see their FY 2013 Medicare revenues change by <0.25%, roughly $125,000 at most.[13, 24] Although these incentives may motivate hospitals that can improve performance and achievement with very modest investments, they may have little impact on organizations that need to make significant upfront investments in care processes to achieve sustainable improvements in care quality. As CMS increases the size of VBP incentives over the next 2 to 4 years, it will also hold hospitals accountable for a broader and increasingly complex set of outcomes. Improving these outcomes may require investments in areas such as information technology and process improvement that far surpass the VBP incentive reward.

Moreover, prior research suggests that financial incentives like those available under VBP may contribute only slightly to performance improvements when public reporting already exists. For example, in a 2‐year study of 613 US hospitals implementing either pay for performance plus public reporting or public reporting only, pay for performance plus public reporting was associated with only a 2.6% to 4.1% greater improvement in a composite measure of quality compared to public reporting alone.[9] Similarly, a study of 54 hospitals participating in the CMS pay for performance pilot initiative found no significant improvement in quality of care or outcomes for AMI when compared to 446 control hospitals.[25] A long‐term analysis of pay for performance in the Medicare Premier Hospital Quality Incentive Demonstration found that participation in the program had no discernible effect on 30‐day mortality rates.[10] Finally, a study of physician medical groups contracting with a large network health maintenance organization found that the implementation of pay for performance did not result in major before‐and‐after improvements in clinical quality compared to a control group of medical groups.[26]

High‐Value Care Is Not Always Low‐Cost Care

Not surprisingly, the clinical process measures included in CMS' hospital VBP program evaluate a select and relatively small group of high‐value and low‐cost interventions (eg, appropriate administration of antibiotics and tight control of serum glucose in surgical patients). However, an important body of work has demonstrated that high‐cost care (eg, intensive inpatient hospital care for common acute medical conditions) may also be highly valuable in terms of improving survival.[20, 27, 28, 29, 30] As the hospital VBP program evolves, its overseers will need to consider whether to include additional incentives for high‐value high‐cost healthcare services. Such considerations will likely become increasingly salient as healthcare delivery organizations move toward capitated delivery models. In particular, the VBP program's Medicare Spending Per Beneficiary measure, which quantifies inpatient and subsequent outpatient spending per beneficiary after a given hospitalization episode, will need to distinguish between higher‐spending hospitals that provide highly effective care (eg, care that reduces mortality and readmissions) and facilities that provide less‐effective care.

FUTURE OF VBP

Although the future of VBP is unknown, CMS is likely to modify the program in a number of ways over the next 3 to 5 years. First, CMS will likely expand the breadth and focus of incentivized measures in the VBP program. In FY 2014, for example, CMS is adding a set of three 30‐day mortality outcome measures to VBP: 30‐day risk‐adjusted mortality for AMI, CHF, and pneumonia.[1] A hospital's performance with respect to these outcomes will represent 25% of its total performance score in 2014, whereas the clinical process of care and patient experience of care domains will account for 45% and 30% of this score, respectively. In 2015, patient experience and outcome measures will each account for 30% of a hospital's performance score, whereas process and efficiency measures will each account for 20%. The composition of this performance score reflects a shift away from rewarding process‐based measures and toward incentivizing measures of clinical outcomes and patient satisfaction, the latter of which may be highly subjective and more representative of a hospital's catchment population than of the hospital's care itself.[31] Additional measures in the domains of patient safety, care coordination, population and community health, emergency room wait times, and cost control may also be added to the VBP program in FY 2015 to FY 2017. Furthermore, CMS will continue to reevaluate the appropriateness of measures that are already included in VBP and will stop incentivizing measures that have become topped out or are no longer supported by the National Quality Forum.[1, 13]
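
Read as a configuration, the shifting domain weights can be summarized in a short sketch. The FY 2013 and FY 2014 weights are as stated above; the FY 2015 weights reflect the anticipated split described in the text and remain subject to change, and the domain labels are simply illustrative names.

```python
# Domain weights on the total performance score by fiscal year, as described
# above (the FY 2015 figures were proposed at the time of writing).
DOMAIN_WEIGHTS = {
    2013: {"clinical_process": 0.70, "patient_experience": 0.30},
    2014: {"clinical_process": 0.45, "patient_experience": 0.30, "outcomes": 0.25},
    2015: {"clinical_process": 0.20, "patient_experience": 0.30,
           "outcomes": 0.30, "efficiency": 0.20},
}

def total_performance_score_by_year(domain_scores, fiscal_year):
    """Weighted total performance score for the given year; `domain_scores`
    maps domain names to 0-1 scores for the domains in force that year."""
    weights = DOMAIN_WEIGHTS[fiscal_year]
    return sum(weights[d] * domain_scores[d] for d in weights)

print(total_performance_score_by_year(
    {"clinical_process": 0.65, "patient_experience": 0.68, "outcomes": 0.60}, 2014))
```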

Second, CMS has established a gradual annual increase of 0.25 percentage points in the share of each hospital's inpatient DRG‐based payments that is at stake under VBP. In FY 2014, for example, participating hospitals will be required to contribute 1.25% of inpatient DRG payments to the VBP program. This percentage is likely to increase to 2% or more by 2017.[1, 32]
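
The arithmetic of this schedule is straightforward and is sketched below; the 0.25 percentage point annual step and the 1% FY 2013 starting point come from the text, while the hospital's $100 million in annual DRG‐based Medicare payments is an assumed figure used only to show the dollars at stake.

```python
def vbp_withhold(fiscal_year):
    """Share of DRG-based inpatient payments at risk under VBP: 1.0% in
    FY 2013, rising 0.25 percentage points per year toward roughly 2% by
    FY 2017 (the trajectory described above)."""
    return 0.01 + 0.0025 * (fiscal_year - 2013)

# Dollars at risk for a hypothetical hospital with $100 million in annual
# DRG-based Medicare payments.
annual_drg_payments = 100_000_000
for fy in range(2013, 2018):
    print(fy, f"{vbp_withhold(fy):.2%}", f"${annual_drg_payments * vbp_withhold(fy):,.0f}")
```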

Third, expansions of the VBP program complement a number of other quality improvement efforts overseen by CMS, including the Hospital Readmissions Reduction Program. Effective for discharges beginning on October 1, 2012, hospitals with excess readmissions for AMI, CHF, and pneumonia are at risk for reimbursement reductions for all Medicare admissions in proportion to the rate of excess rehospitalizations. Some of the same concerns raised about the hospital VBP program have also been raised about this program: whether readmission penalties will be large enough to impact hospital behavior, whether readmissions are even preventable,[33, 34] and whether adjustments in hospital‐level policies will reduce readmissions that are known to be heavily influenced by patient economic and social factors outside of a hospital's control.[35, 36] Despite the limitations of VBP and the challenges that lie ahead, there is optimism that rewarding hospitals that provide high‐value rather than high‐volume care will not only improve outcomes of hospitalized patients in the United States, but will potentially do so at a lower cost. Encouraging hospitals to improve their quality of care may also have important spillover effects on other healthcare domains. For example, hospitals that adopt systems to ensure prompt delivery of antibiotics to patients with pneumonia may also observe positive spillover effects in the prompt antibiotic management of other acute infectious illnesses that are not covered by VBP. VBP may have spillover effects on medical malpractice liability and defensive medicine as well; financial incentives to practice higher‐quality, evidence‐based care may reduce both.

The government's ultimate goal in implementing VBP is to identify a broad and clinically relevant set of outcome measures that can be used to incentivize hospitals to deliver high‐quality as opposed to high‐volume healthcare. The first wave of outcome measures has already been instituted. It remains to be seen whether the incentive rewards of Medicare's hospital VBP program will be large enough that hospitals feel compelled to improve and compete for them.

References
  1. Centers for Medicare and Medicaid Services. Hospital Value‐Based Purchasing Web site. 2013. Available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/hospital-value-based-purchasing/index.html. Accessed March 4, 2013.
  2. VanLare JM, Conway PH. Value‐based purchasing—national programs to move from volume to value. N Engl J Med. 2012;367:292-295.
  3. Joynt KE, Rosenthal MB. Hospital value‐based purchasing: will Medicare's new policy exacerbate disparities? Circ Cardiovasc Qual Outcomes. 2012;5:148-149.
  4. Centers for Medicare and Medicaid Services. CMS/Premier Hospital Quality Incentive Demonstration (HQID). 2013. Available at: https://www.premierinc.com/quality-safety/tools-services/p4p/hqi/faqs.jsp. Accessed March 5, 2013.
  5. Centers for Medicare and Medicaid Services. Hospital Compare Web site. 2013. Available at: http://www.medicare.gov/hospitalcompare. Accessed March 4, 2013.
  6. Brown J, Doloresco F, Mylotte JM. “Never events”: not every hospital‐acquired infection is preventable. Clin Infect Dis. 2009;49:743-746.
  7. Epstein AM. Will pay for performance improve quality of care? The answer is in the details. N Engl J Med. 2012;367:1852-1853.
  8. Sutton M, Nikolova S, Boaden R, Lester H, McDonald R, Roland M. Reduced mortality with hospital pay for performance in England. N Engl J Med. 2012;367:1821-1828.
  9. Lindenauer PK, Remus D, Roman S, et al. Public reporting and pay for performance in hospital quality improvement. N Engl J Med. 2007;356:486-496.
  10. Jha AK, Joynt KE, Orav EJ, Epstein AM. The long‐term effect of Premier pay for performance on patient outcomes. N Engl J Med. 2012;366:1606-1615.
  11. Houle SK, McAlister FA, Jackevicius CA, Chuck AW, Tsuyuki RT. Does performance‐based remuneration for individual health care practitioners affect patient care?: a systematic review. Ann Intern Med. 2012;157:889-899.
  12. Centers for Medicare and Medicaid Services. Hospital Consumer Assessment of Healthcare Providers and Systems Web site. 2013. Available at: http://www.hcahpsonline.org. Accessed March 5, 2013.
  13. Rau J. Medicare discloses hospitals' bonuses, penalties based on quality. Kaiser Health News. December 20, 2012. Available at: http://www.kaiserhealthnews.org/stories/2012/december/21/medicare-hospitals-value-based-purchasing.aspx?referrer=search. Accessed March 26, 2013.
  14. Yasaitis L, Fisher ES, Skinner JS, Chandra A. Hospital quality and intensity of spending: is there an association? Health Aff (Millwood). 2009;28:w566-w572.
  15. Fonarow GC, Abraham WT, Albert NM, et al. Association between performance measures and clinical outcomes for patients hospitalized with heart failure. JAMA. 2007;297:61-70.
  16. Rubin HR, Pronovost P, Diette GB. The advantages and disadvantages of process‐based measures of health care quality. Int J Qual Health Care. 2001;13:469-474.
  17. Jacob BA. Accountability, incentives and behavior: the impact of high‐stakes testing in the Chicago public schools. J Public Econ. 2005;89:761-796.
  18. Fisher ES, Wennberg DE, Stukel TA, Gottlieb DJ, Lucas FL, Pinder EL. The implications of regional variations in Medicare spending. Part 1: the content, quality, and accessibility of care. Ann Intern Med. 2003;138:273-287.
  19. Fisher ES. Medical care—is more always better? N Engl J Med. 2003;349:1665-1667.
  20. Romley JA, Jena AB, Goldman DP. Hospital spending and inpatient mortality: evidence from California: an observational study. Ann Intern Med. 2011;154:160-167.
  21. James BC. Making it easy to do it right. N Engl J Med. 2001;345:991-993.
  22. Christensen RD, Henry E, Ilstrup S, Baer VL. A high rate of compliance with neonatal intensive care unit transfusion guidelines persists even after a program to improve transfusion guideline compliance ended. Transfusion. 2011;51:2519-2520.
  23. Lester H, Schmittdiel J, Selby J, et al. The impact of removing financial incentives from clinical quality indicators: longitudinal analysis of four Kaiser Permanente indicators. BMJ. 2010;340:c1898.
  24. Werner RM, Dudley RA. Medicare's new hospital value‐based purchasing program is likely to have only a small impact on hospital payments. Health Aff (Millwood). 2012;31:1932-1940.
  25. Glickman SW, Ou FS, DeLong ER, et al. Pay for performance, quality of care, and outcomes in acute myocardial infarction. JAMA. 2007;297:2373-2380.
  26. Mullen KJ, Frank RG, Rosenthal MB. Can you get what you pay for? Pay‐for‐performance and the quality of healthcare providers. Rand J Econ. 2010;41:64-91.
  27. Romley JA, Jena AB, O'Leary JF, Goldman DP. Spending and mortality in US acute care hospitals. Am J Manag Care. 2013;19:e46-e54.
  28. Barnato AE, Farrell MH, Chang CC, Lave JR, Roberts MS, Angus DC. Development and validation of hospital “end‐of‐life” treatment intensity measures. Med Care. 2009;47:1098-1105.
  29. Ong MK, Mangione CM, Romano PS, et al. Looking forward, looking back: assessing variations in hospital resource use and outcomes for elderly patients with heart failure. Circ Cardiovasc Qual Outcomes. 2009;2:548-557.
  30. Stukel TA, Fisher ES, Alter DA, et al. Association of hospital spending intensity with mortality and readmission rates in Ontario hospitals. JAMA. 2012;307:1037-1045.
  31. Young GJ, Meterko M, Desai KR. Patient satisfaction with hospital care: effects of demographic and institutional characteristics. Med Care. 2000;38:325-334.
  32. VanLare JM, Blum JD, Conway PH. Linking performance with payment: implementing the Physician Value‐Based Payment Modifier. JAMA. 2012;308:2089-2090.
  33. van Walraven C, Bennett C, Jennings A, Austin PC, Forster AJ. Proportion of hospital readmissions deemed avoidable: a systematic review. CMAJ. 2011;183:E391-E402.
  34. van Walraven C, Jennings A, Taljaard M, et al. Incidence of potentially avoidable urgent readmissions and their relation to all‐cause urgent readmissions. CMAJ. 2011;183:E1067-E1072.
  35. Joynt KE, Jha AK. Thirty‐day readmissions—truth and consequences. N Engl J Med. 2012;366:1366-1369.
  36. Joynt KE, Orav EJ, Jha AK. Thirty‐day readmission rates for Medicare beneficiaries by race and site of care. JAMA. 2011;305:675-681.
Journal of Hospital Medicine. 8(5):271-277.

Will VBP Incentives Be Large Enough to Change Practice Patterns?

The VBP Program's ability to influence change depends, at least in part, on how the incentives offered under this program compare to the magnitude of the investments that hospitals must make to achieve a given reward. In general, larger incentives are necessary to motivate more significant changes in behavior or to influence organizations to invest the resources needed to achieve change. The incentives offered under VBP in FY 2013 are quite modest. Almost two‐thirds of participating hospitals will see their FY 2013 Medicare revenues change by <0.25%, roughly $125,000 at most.[13, 24] Although these incentives may motivate hospitals that can improve performance and achievement with very modest investments, they may have little impact on organizations that need to make significant upfront investments in care processes to achieve sustainable improvements in care quality. As CMS increases the size of VBP incentives over the next 2 to 4 years, it will also hold hospitals accountable for a broader and increasingly complex set of outcomes. Improving these outcomes may require investments in areas such as information technology and process improvement that far surpass the VBP incentive reward.

Moreover, prior research suggests that financial incentives like those available under VBP may contribute only slightly to performance improvements when public reporting already exists. For example, in a 2‐year study of 613 US hospitals implementing pay‐for‐performance plus public reporting or public reporting only, pay for performance plus public reporting was associated with only a 2.6% to 4.1% increase in a composite measure of quality when compared to hospitals with public reporting only.[9] Similarly, a study of 54 hospitals participating in the CMS pay for performance pilot initiative found no significant improvement in quality of care or outcomes for AMI when compared to 446 control hospitals.[25] A long‐term analysis of pay for performance in the Medicare Premier Hospital Quality Incentive Demonstration found that participation in the program had no discernible effect on 30‐day mortality rates.[10] Finally, a study of physician medical groups contracting with a large network healthcare maintenance organization found that the implementation of pay for performance did not result in major before and after improvements in clinical quality compared to a control group of medical groups.[26]

High‐Value Care Is Not Always Low‐Cost Care

Not surprisingly, the clinical process measures included in CMS' hospital VBP program evaluate a select and relatively small group of high‐value and low‐cost interventions (eg, appropriate administration of antibiotics and tight control of serum glucose in surgical patients). However, an important body of work has demonstrated that high‐cost care (eg, intensive inpatient hospital care for common acute medical conditions) may also be highly valuable in terms of improving survival.[20, 27, 28, 29, 30] As the hospital VBP program evolves, its overseers will need to consider whether to include additional incentives for high‐value high‐cost healthcare services. Such considerations will likely become increasingly salient as healthcare delivery organizations move toward capitated delivery models. In particular, the VBP program's Medicare Spending Per Beneficiary measure, which quantifies inpatient and subsequent outpatient spending per beneficiary after a given hospitalization episode, will need to distinguish between higher‐spending hospitals that provide highly effective care (eg, care that reduces mortality and readmissions) and facilities that provide less‐effective care.

FUTURE OF VBP

Although the future of VBP is unknown, CMS is likely to modify the program in a number of ways over the next 3 to 5 years. First, CMS will likely expand the breadth and focus of incentivized measures in the VBP program. In FY 2014, for example, CMS is adding a set of 3, 30‐day mortality outcome measures to VBP: 30‐day risk‐adjusted mortality for AMI, CHF, and pneumonia.[1] A hospital's performance with respect to these outcomes will represent 25% of its total performance score in 2014, whereas the clinical process of care and patient experience of care domains will account for 45% and 30% of this score, respectively. In 2015, patient experience and outcome measures will account for 30% each in a hospital's performance score, whereas process and efficiency measures will each account for 20% of this score, respectively. The composition of this performance score evidences a shift away from rewarding process‐based measures and toward incentivizing measures of clinical outcomes and patient satisfaction, the latter of which may be highly subjective and more representative of a hospital's catchment population than of a hospital's care itself.[31] Additional measures in the domains of patient safety, care coordination, population and community health, emergency room wait times, and cost control may also be added to the VBP program in FY 2015 to FY 2017. Furthermore, CMS will continue to reevaluate the appropriateness of measures that are already included in VBP and will stop incentivizing measures that have become topped out, or are no longer supported by the National Quality Forum.[1, 13]

Second, CMS has established an annual gradual increase of 0.25% in the percentage of each hospital's inpatient DRG‐based payment that is at stake under VBP. In FY 2014, for example, participating hospitals will be required to contribute 1.25% of inpatient DRG payments to the VBP program. This percentage is likely to increase to 2% or more by 2017.[1, 32]

Third, expansions of the VBP program complement a number of other quality improvement efforts overseen by CMS, including the Hospital Readmissions Reduction Program. Effective for discharges beginning on October 1, 2012, hospitals with excess readmissions for AMI, CHF, and pneumonia are at risk for reimbursement reductions for all Medicare admissions in proportion to the rate of excess rehospitalizations. Some of the same concerns about the hospital VBP program outlined above have also been raised for this program, namely, whether readmission penalties will be large enough to impact hospital behavior, whether readmissions are even preventable,[33, 34] and whether adjustments in hospital‐level policies will reduce admissions that are known to be heavily influenced by patient economic and social factors that are outside of a hospital's control.[35, 36] Despite the limitations of VBP and the challenges that lie ahead, there is optimism that rewarding hospitals that provide high‐value rather than high‐volume care will not only improve outcomes of hospitalized patients in the United States, but will potentially be able to do so at a lower cost. Encouraging hospitals to improve their quality of care may also have important spillover effects on other healthcare domains. For example, hospitals that adopt systems to ensure prompt delivery of antibiotics to patients with pneumonia may also observe positive spillover effects with the prompt antibiotic management of other acute infectious illnesses that are not covered by VBP. VBP may have spillover effects on medical malpractice liability and defensive medicine as well. Indeed, financial incentives to practice higher‐quality evidenced‐based care may reduce medical malpractice liability and defensive medicine.

The government's ultimate goal in implementing VBP is to identify a broad and clinically relevant set of outcome measures that can be used to incentivize hospitals to deliver high‐quality as opposed to high‐volume healthcare. The first wave of outcome measures has already been instituted. It remains to be seen whether the incentive rewards of Medicare's hospital VBP program will be large enough that hospitals feel compelled to improve and compete for them.

The Centers for Medicaid and Medicare Services' (CMS) Hospital Inpatient Value‐Based Purchasing (VBP) Program, which was signed into law as part of the Patient Protection and Affordable Care Act of 2010, aims to incentivize inpatient providers to deliver high‐value, as opposed to high‐volume, healthcare.[1] Beginning on October 1, 2012, the start of the 2013 fiscal year (FY), hospitals participating in the VBP program became eligible for a variety of performance‐based incentive payments from CMS. These payments are based on an acute care hospital's ability to meet performance measurements in 6 care domains: (1) patient safety, (2) care coordination, (3) clinical processes and outcomes, (4) population or community health, (5) efficiency and cost reduction, and (6) patient‐ and caregiver‐centered experience.[2] The VBP program's ultimate purpose is to enable CMS to improve the health of Medicare beneficiaries by purchasing better care for them at a lower cost. These 3 characteristics of careimproved health, improved care, and lower costsare the foundation of CMS' conception of value.[1, 2] They are closely related to an economic conception of value, which is the difference between an intervention's benefit and its cost.

Although in principle not a new idea, the formal mandate of hospitals to provide high‐value healthcare through financial incentives marks an important change in Medicare and Medicaid policy. In this opportune review of VBP, we first discuss the relevant historical changes in the reimbursement environment of US hospitals that have set the stage for VBP. We then describe the structure of CMS' VBP program, with a focus on which facilities are eligible to participate in the program, the specific outcomes measured and incentivized, how rewards and penalties are allocated, and how the program will be funded. In an effort to anticipate some of the issues that lie ahead, we then highlight a number of potential challenges to the success of VBP, and discuss how VBP will impact the delivery and reimbursement of inpatient care services. We conclude by examining how the VBP program is likely to evolve over time.

HISTORICAL CONTEXT FOR VBP

Over the last decade, CMS has embarked on a number of initiatives to incentivize the provision of higher‐quality and more cost‐effective care. For example, in 2003, CMS implemented a national pay‐for‐performance (P4P) pilot project called the Premier Hospital Quality Incentive Demonstration (HQID).[3, 4] HQID, which ran for 6 years, tracked and rewarded the performance of 216 hospitals in 6 healthcare service domains: (1) acute myocardial infarction (AMI), (2) congestive heart failure (CHF), (3) pneumonia, (4) coronary artery bypass graft surgery, (5) hip and knee replacement surgery, and (6) perioperative management of surgical patients (including prevention of surgical site infections).[4] CMS then introduced its Hospital Compare Web site in 2005 to facilitate public reporting of hospital‐level quality outcomes.[3, 5] This Web site provides the public with access to data on hospital performance across a wide array of measures of process quality, clinical outcomes, spending, and resource utilization.[5] Next, in October 2008, CMS stopped reimbursing hospitals for a number of costly and common hospital‐acquired complications, including hospital‐acquired bloodstream infections and urinary tract infections, patient falls, and pressure ulcers.[3, 6] VBP is the latest and most comprehensive step that CMS has taken in its decade‐long effort to shift from volume to value‐based compensation for inpatient care.

Although CMS appears fully invested in using performance incentives to increase healthcare value, existing evidence of the effects of P4P on patient outcomes remains quite mixed.[7] On the one hand, an analysis of an inpatient P4P program sponsored by the United Kingdom's National Health Service (NHS) suggests that P4P may improve quality and save lives; hospitals that participated in the NHS P4P program significantly reduced inpatient mortality from pneumonia, saving an estimated 890 lives.[8] Additional empirical work suggests that the HQID was also associated with early improvements in healthcare quality.[9] However, a subsequent long‐term analysis found that participation in HQID had no discernible effect on 30‐day mortality rates.[10] Moreover, a systematic review of P4P incentives for individual practitioners found few methodologically robust studies of P4P for clinicians and concluded that P4P's effects on individual practice patterns and outcomes remain largely uncertain.[11]

VBP: STRUCTURE AND DESIGN

This section reviews the structure of the VBP program. We describe current VBP eligibility criteria and sources of funding for the program, how hospitals participating in VBP are evaluated, and how VBP incentives for FY 2013 have been calculated.

Hospital Eligibility for VBP

All acute care hospitals in the United States (excluding Maryland) that are not psychiatric hospitals, rehabilitation hospitals, long‐term care facilities, children's hospitals, or cancer hospitals are eligible to participate in VBP in FY 2013 (full eligibility criteria are outlined in Table 1). For FY 2013, CMS chose to incentivize measures in just 2 care domains: (1) clinical processes of care and (2) patient experience of care. To be eligible for VBP in FY 2013, a hospital must report at least 10 cases in each of at least 4 of the 12 measures included in the clinical processes of care domain (Table 2), and/or must have at least 100 completed Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) surveys. Designed and validated by CMS, the HCAHPS survey provides hospitals with a standardized instrument for gathering information about patient satisfaction with, and perspectives on, their hospital care.[12] HCAHPS will be used to assess 8 patient experience of care measures (Table 3).

Table 1. Inclusion and Exclusion Criteria for the Inpatient Value‐Based Purchasing Program in Fiscal Year 2013

Inclusion criteria
  • Acute care hospital
  • Located in one of the 50 US states (excluding Maryland) or the District of Columbia
  • Has at least 10 cases in each of at least 4 of the 12 clinical process of care measures and/or at least 100 completed HCAHPS surveys

Exclusion criteria
  • Psychiatric, rehabilitation, long‐term care, children's, or cancer hospital
  • Does not participate in the Hospital Inpatient Quality Reporting Program during the VBP performance period
  • Cited by the Secretary of HHS for significant patient safety violations during the performance period
  • Does not meet minimum reporting requirements for number of cases, process measures, and surveys needed to participate in VBP

NOTE: Abbreviations: HCAHPS, Hospital Consumer Assessment of Healthcare Providers and Systems; HHS, US Department of Health and Human Services; VBP, Value‐Based Purchasing.
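The eligibility rules in Table 1 lend themselves to a simple check. The sketch below is a minimal, illustrative Python rendering of the FY 2013 inclusion and exclusion criteria; the data structure, field names, and function are our own assumptions for exposition and are not drawn from any CMS specification.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class Hospital:
    hospital_type: str                      # e.g., "acute care"
    state: str                              # two-letter state code or "DC"
    process_measure_cases: Dict[str, int]   # clinical process measure -> reported cases
    completed_hcahps_surveys: int
    in_iqr_program: bool                    # reports through Hospital IQR during the performance period
    cited_for_safety_violation: bool        # cited by the Secretary of HHS for patient safety violations

def eligible_for_vbp_fy2013(h: Hospital) -> bool:
    """Illustrative FY 2013 VBP eligibility check mirroring Table 1."""
    if h.hospital_type != "acute care":   # psychiatric, rehabilitation, long-term care,
        return False                      # children's, and cancer hospitals are excluded
    if h.state == "MD":                   # Maryland hospitals are excluded
        return False
    if not h.in_iqr_program or h.cited_for_safety_violation:
        return False
    # Minimum reporting: at least 10 cases in at least 4 of the 12 clinical
    # process measures, and/or at least 100 completed HCAHPS surveys.
    measures_with_enough_cases = sum(1 for n in h.process_measure_cases.values() if n >= 10)
    return measures_with_enough_cases >= 4 or h.completed_hcahps_surveys >= 100
```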
Table 2. Clinical Process of Care Measures Evaluated by Value‐Based Purchasing in Fiscal Year 2013

Acute myocardial infarction
  • Fibrinolytic therapy received within 30 minutes of hospital arrival
  • Primary percutaneous coronary intervention received within 90 minutes of hospital arrival
Heart failure
  • Discharge instructions provided
Pneumonia
  • Blood cultures performed in the emergency department prior to initial antibiotic received in hospital
  • Initial antibiotic selection for community‐acquired pneumonia in immunocompetent patient
Healthcare‐associated infections
  • Prophylactic antibiotic received within 1 hour prior to surgical incision
  • Prophylactic antibiotic selection for surgical patients
  • Prophylactic antibiotics discontinued within 24 hours after surgery ends
  • Cardiac surgery patients with controlled 6:00 AM postoperative serum glucose
Surgeries
  • Surgery patients on a β-blocker prior to arrival who received a β-blocker during the perioperative period
  • Surgery patients with recommended venous thromboembolism prophylaxis ordered
  • Surgery patients who received appropriate venous thromboembolism prophylaxis from 24 hours prior to surgery to 24 hours after surgery

NOTE: Mortality measures to be added in fiscal year 2014: acute myocardial infarction, congestive heart failure, pneumonia.
Table 3. Patient Experience of Care Measures Evaluated by Value‐Based Purchasing in Fiscal Year 2013
  • Communication with nurses
  • Communication with doctors
  • Responsiveness of hospital staff
  • Pain management
  • Communication about medicines
  • Cleanliness and quietness of hospital environment
  • Discharge information
  • Overall rating of hospital

Participation in the program is mandatory for eligible hospitals, and CMS estimates that more than 3000 facilities across the United States will participate in FY 2013. Roughly $850 million in VBP incentives will be paid out to these participating hospitals in FY 2013. The program is being financed through a 1% across‐the‐board reduction in FY 2013 diagnosis‐related group (DRG)‐based inpatient payments to participating hospitals. On December 20, 2012, CMS publicly announced FY 2013 VBP incentives for all participating hospitals. Each hospital's incentive is retroactive and based on its performance between July 1, 2011 and March 31, 2012.

All data used to calculate VBP incentives are reported to CMS through its Hospital Inpatient Quality Reporting (Hospital IQR) Program, a national program instituted in 2003 that rewards hospitals for reporting designated quality measures. As of 2007, approximately 95% of eligible US hospitals were participating in the Hospital IQR Program.[1] Measures evaluated via chart abstraction and surveys reflect a hospital's performance for its entire patient population, whereas measures assessed with claims data reflect hospital performance only for Medicare patients.

Evaluation of Hospitals

In FY 2013, hospital VBP incentive payments will be based entirely on performance in 2 domains: (1) clinical processes of care (weighted 70%) and (2) patient experience of care (weighted 30%). For each domain, CMS will evaluate each hospital's improvement over time as well as its achievement compared to other hospitals in the VBP program. By assessing and rewarding both achievement and improvement, CMS ensures that lower‐performing hospitals can still be rewarded for making substantial improvements in quality. To evaluate the first metric, improvement over time, CMS will compare a hospital's performance during a given reporting period with its baseline performance 2 years prior; a hospital receives improvement points for improving its performance over time. To assess the second metric, achievement compared to other hospitals in the VBP program, CMS will compare each hospital's performance during a reporting period with the baseline performance (ie, performance 2 years prior to the reporting period) of all other hospitals in the VBP program. A hospital is awarded achievement points if its performance exceeds the 50th percentile of all hospitals during the baseline performance period. Improvement scores range from 0 to 9, whereas achievement scores range from 0 to 10. The greater of a hospital's improvement and achievement scores on each VBP measure is used to calculate the hospital's total earned clinical care domain score and total earned HCAHPS base score. Hospitals that lack the baseline performance data required to assess improvement will be evaluated solely on the basis of achievement points.[1] The total earned clinical care domain score is multiplied by 70% to arrive at the clinical care domain's contribution to a hospital's total performance score.
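The scoring logic just described can be summarized in a short sketch. The linear interpolation between the 50th-percentile achievement threshold and an assumed top-performance benchmark is an illustrative assumption on our part; CMS' exact point formulas are specified in its rulemaking and may differ in detail.

```python
def achievement_points(rate, threshold, benchmark, max_points=10):
    """Achievement: performance relative to other hospitals' baseline period.

    `threshold` is the 50th percentile of all hospitals' baseline performance;
    `benchmark` is an assumed top-performance level. Linear interpolation
    between them is illustrative, not CMS' exact formula.
    """
    if rate < threshold:
        return 0
    if rate >= benchmark:
        return max_points
    return round(max_points * (rate - threshold) / (benchmark - threshold))

def improvement_points(rate, own_baseline, benchmark, max_points=9):
    """Improvement: performance relative to the hospital's own baseline."""
    if rate <= own_baseline:
        return 0
    if rate >= benchmark:
        return max_points
    return round(max_points * (rate - own_baseline) / (benchmark - own_baseline))

def measure_points(rate, own_baseline, threshold, benchmark):
    # The greater of achievement and improvement counts toward the domain score.
    return max(achievement_points(rate, threshold, benchmark),
               improvement_points(rate, own_baseline, benchmark))
```

For example, a hospital whose adherence on a measure rose from 0.70 at baseline to 0.85, against an assumed 0.80 achievement threshold and 0.95 benchmark, would earn roughly 3 achievement points but 5 improvement points under this sketch and be credited with the larger of the two.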

Each hospital's total patient experience domain, or HCAHPS performance, score consists of 2 components: a total earned HCAHPS base score, as described above, and a consistency score. The consistency score evaluates the reliability of a hospital's performance across all 8 patient experience of care measures (Table 3). If a hospital is above the 50th percentile of all hospitals' baseline‐period scores on all 8 measures, it receives 100% of its consistency points. If a hospital is at the 0 percentile for any single measure, it receives 0 consistency points. If 1 or more measures fall between the 0 and 50th percentiles, the hospital receives a consistency score that takes into account how many measures fell below the 50th percentile and their distance from this threshold. This provision promotes consistency by harshly penalizing hospitals with extremely poor performance on any 1 measure. Each hospital's total HCAHPS performance score (the sum of total earned HCAHPS base points and consistency points) is then multiplied by 30% to arrive at the patient experience of care domain's contribution to a hospital's total performance score.
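One simple scheme consistent with that description keys the consistency score to the single worst-performing patient experience measure. The sketch below is our own simplification (both the worst-measure rule and the point maximum are assumptions), not CMS' published formula.

```python
def consistency_points(percentile_ranks, max_points=20):
    """Illustrative HCAHPS consistency score.

    `percentile_ranks` holds the hospital's percentile rank (0.0-1.0) on each
    of the 8 patient experience measures relative to the baseline-period
    distribution. Keying the score to the worst-performing measure, and the
    default point maximum, are simplifying assumptions.
    """
    worst = min(percentile_ranks)
    if worst >= 0.5:   # above the 50th percentile on every measure: full credit
        return max_points
    if worst <= 0.0:   # at the floor on any single measure: no credit
        return 0
    # Partial credit scaled by how far the worst measure sits below the 50th percentile.
    return round(max_points * worst / 0.5)
```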

Importantly, CMS excluded from its VBP initiative 10 clinical process measures reported in the Hospital IQR Program because they are topped out; that is, almost all hospitals already perform them at very high rates (Table 4). Examples of these topped out process measures include administration of aspirin to all patients with AMI on arrival at the hospital; counseling of patients with AMI, CHF, and pneumonia about smoking cessation; and prescribing angiotensin‐converting enzyme inhibitors or angiotensin receptor blockers to patients with CHF and left ventricular dysfunction.[1]

Table 4. Topped‐Out Measures

Acute myocardial infarction
  • Aspirin administered on arrival to the emergency department
  • ACEI or ARB prescribed on discharge
  • Patient counseled about smoking cessation
  • β-Blocker prescribed on discharge
  • Aspirin prescribed at discharge
Heart failure
  • Patient counseled about smoking cessation
  • Evaluation of left ventricular systolic function
  • ACEI or ARB prescribed for left ventricular systolic dysfunction
Pneumonia
  • Patient counseled about smoking cessation
Surgical Care Improvement Project
  • Surgery patients with appropriate hair removal

NOTE: Abbreviations: ACEI, angiotensin‐converting enzyme inhibitor; ARB, angiotensin receptor blocker.

Calculation of VBP Incentives and Public Reporting

A hospital's total performance score for FY 2013 is equal to the sum of 70% of its clinical care domain score and 30% of its total HCAHPS performance score. This total performance score is entered into a linear mathematical formula to calculate each hospital's incentive payment. CMS projects that VBP will lead to a net increase in Medicare payments for one‐half of hospitals and a net decrease in payments for the other half of participating facilities.[1]
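The article does not reproduce CMS' linear formula, but a hedged sketch of how such a budget-neutral linear exchange could work appears below: each hospital's share of the withheld 1% pool is proportional to its total performance score, with the slope chosen so that payouts exactly exhaust the pool. This mirrors the structure described above, not CMS' actual exchange function.

```python
def vbp_payment_adjustments(hospitals, withhold=0.01):
    """Illustrative budget-neutral linear exchange for VBP incentives.

    `hospitals` maps a hospital id to (total_performance_score, drg_payments),
    with the score on a 0-1 scale. Each hospital contributes `withhold`
    (1% in FY 2013) of its DRG-based payments to the incentive pool, and the
    pool is redistributed in proportion to score x payments. This is a sketch
    under those assumptions, not CMS' published formula.
    """
    pool = sum(withhold * pay for _, pay in hospitals.values())
    weighted_scores = sum(score * pay for score, pay in hospitals.values())
    slope = pool / weighted_scores
    net_change = {}
    for hid, (score, pay) in hospitals.items():
        earned = slope * score * pay               # incentive earned back from the pool
        net_change[hid] = earned - withhold * pay  # net change in Medicare payments
    return net_change
```

Under such a scheme, hospitals scoring above the payment-weighted average score come out ahead and those below it come out behind, which is consistent with CMS' projection that roughly half of participating facilities will see net increases.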

In December 2012, CMS publicly disclosed information about the initial performance of each hospital in the VBP program. Reported information included: (1) hospital performance on each applicable performance measure, (2) hospital performance by disease condition or procedure, and (3) each hospital's total performance score. Initial analyses of these performance data revealed that 1557 hospitals will receive bonus payments under VBP in FY 2013, whereas 1427 hospitals will lose money under this program. Treasure Valley Hospital, a 10‐bed physician‐owned hospital in Boise, Idaho, will receive a 0.83% increase in Medicare payments, the largest payment increase under VBP in 2013. Conversely, Auburn Community Hospital in upstate New York will suffer the most severe payment reduction: 0.9% per Medicare admission. The penalty will cost Auburn Hospital about $100,000, which is slightly more than 0.1% of its yearly $85 million operating budget.[13] For almost two‐thirds of participating hospitals, FY 2013 Medicare payments will change by <0.25%.[13] Additional information about VBP payments for FY 2013, including the number of hospitals that received VBP incentives and the size and range of these payments, is now accessible to the public through CMS' Hospital Compare Web site (http://www.hospitalcompare.hhs.gov).

CHALLENGES OF VBP

As the Medicare VBP program evolves, and hospitals confront ever‐larger financial incentives to deliver high‐value as opposed to high‐volume care, it will be important to recognize limitations of the VBP program as they arise. Here we briefly discuss several conceptual and implementation challenges that physicians and policymakers should consider when assessing the merits of VBP in promoting high‐quality healthcare.

Rigorous and Continuous Evaluation of VBP Programs

The main premise of using VBP to incentivize hospitals to deliver high‐quality, cost‐effective care is that the process measures used to determine hospital quality actually impact patient outcomes. However, it is well established that improvements in measures of process quality are not always associated with improvements in patient outcomes.[14, 15, 16] Moreover, incentivizing specific process measures encourages hospitals to shift resources away from other aspects of care delivery, which may have ambiguous, or even deleterious, effects on patient outcomes. Although incentives ideally push hospitals to shift resources away from low‐quality care toward high‐quality care, in practice this is not always the case. Hospital resources may instead be drawn away from areas that are not yet incented by VBP, but for which improvements in quality of care are desperately needed. The same empirical rigor behind using VBP to incentivize hospitals to improve patient outcomes efficiently should be applied to evaluating whether VBP continues to meet its stated goals: reducing overall patient morbidity and mortality and improving patient satisfaction, ideally at lower cost. The experience of the US education system with public policies designed to improve student testing performance may serve as a cautionary example here. Such policies, which provide financial rewards to schools whose students perform well on standardized tests, can indeed raise testing performance. However, these policies also lead educators to teach to the test and to neglect important topics that are not tested on standardized exams.[17]

Prioritization of Process Measures

As payment incentives for VBP currently stand, process measures are weighted equally regardless of the clinical benefits they generate and the resources required to achieve improvements in process quality. For instance, 2 process measures, continuing home β-blocker medications for patients with coronary artery disease undergoing surgery and early percutaneous coronary intervention for patients with AMI, may be weighted equally even though their clinical benefits and implementation costs differ substantially. Some hospitals responding to VBP incentives may choose to invest in areas where their ability to earn VBP incentive payments is high and the costs of improvement are low, even though those areas may not be where interventions are most needed or where clinical outcomes could be most improved. Recognizing that process measures have heterogeneous benefits and implementation costs is important when prioritizing their reimbursement in VBP.

Measuring Improvements in Hospital Quality

Tying hospital financial compensation to hospital quality implies that measures of hospital quality should be robust. To incentivize hospitals to improve quality not only relative to other hospitals but also relative to their own past performance, the VBP program has established a baseline performance for each hospital, and each hospital is compared to its baseline in subsequent evaluation periods. Properly measuring a hospital's baseline performance is therefore important. During a given baseline period, some hospitals may have better or worse outcomes than their steady state due to random variation alone. Some hospitals deemed to have a low baseline will experience improvements in quality that are unrelated to active quality improvement efforts and arise through chance alone. Similarly, some hospitals deemed to have a high baseline will experience reductions in quality through chance. Neither of these changes should be subject to differences in reimbursement, because they do not reflect actual organizational changes made by the hospitals. The VBP program has made significant efforts to address this issue by requiring participating hospitals to have a large enough sample of cases that estimated rates of process quality adherence meet a reliability threshold (ie, are likely to be consistent over time rather than vary substantially through chance alone). However, not all process measures exhibit high reliability, particularly those for which adverse events are rare (eg, foreign objects retained after surgery, air embolisms, and blood incompatibility). Ultimately, CMS' decision to balance the need for statistically reliable data against the goal of including as many hospitals as possible in the VBP program will require ongoing reevaluation.
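To illustrate why small case counts undermine reliability, the short calculation below estimates the sampling variability of an observed adherence rate as a function of the number of eligible cases. The normal-approximation confidence interval is our own illustrative yardstick, not the reliability standard CMS applies.

```python
import math

def adherence_ci_halfwidth(rate, n_cases, z=1.96):
    """Approximate 95% confidence-interval half-width for an observed adherence rate."""
    return z * math.sqrt(rate * (1 - rate) / n_cases)

# With 10 eligible cases and 80% observed adherence, the underlying rate is pinned
# down only to within roughly +/-25 percentage points; with 400 cases, within ~4.
for n in (10, 50, 400):
    print(n, round(adherence_ci_halfwidth(0.8, n), 3))
```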

Choosing Hospital Comparators Appropriately

In the current VBP program, hospitals will be evaluated in part by how they compare to hospitals nationally. However, studies of regional variation in healthcare have demonstrated large variations in practice patterns across the United States,[18, 19, 20] raising the question of whether hospitals should, at least initially, be compared to hospitals in the same geographic area. Although the ultimate goal of VBP should be to hold hospitals to a national standard, local practice patterns are not easily modified within 1‐ to 2‐year timeframes. Initially comparing hospitals to a national rather than a local standard may unfairly penalize hospitals that are relative underperformers nationally but overperformers regionally. Although CMS' policy of rewarding improvement within hospitals over time mitigates issues arising from a cross‐sectional comparison of hospitals, the problem persists if many hospitals within a region not only underperform relative to other hospitals nationally but also fail to demonstrate improvement. More broadly, this issue extends to differences across hospitals in factors that affect their ability to meet VBP goals, including, for example, hospital size, profitability, patient case and insurance mix, and the presence of an electronic medical record. Comparing hospitals with vastly different abilities to achieve VBP goals and improve quickly may amount to inequitable policy.

Continual Evaluation of Topped‐Out Measures

Process measures that are met at high rates at nearly all hospitals are not used in CMS' VBP evaluations. An assumption underlying CMS' decision not to reward hospitals for achieving these topped‐out measures is that once physicians and hospitals make the cognitive and system‐level improvements that improve process quality, these gains will persist after the incentive is removed. Thus, CMS hopes and anticipates that although performance incentives will make it easier for well‐meaning physicians to learn to do the right thing, doctors will continue to do the right things for patients after these incentives are removed.[21, 22] Although this assumption may generally be accurate, it is important to continue to evaluate whether performance on currently topped‐out measures remains adequate, because rewarding new quality measures will necessarily lead hospitals to reallocate resources away from other clinical activities. Although we hope that continued public reporting of topped‐out measures will prevent declines in performance on these measures, policy makers and clinicians should be aware that the lack of financial incentives for topped‐out measures may result in declines in quality. To this point, an analysis of 35 Kaiser Permanente facilities from 1997 to 2007 demonstrated that the removal of financial incentives for diabetic retinopathy and cervical cancer screening was associated with subsequent declines in performance of 3% and 1.6% per year, respectively.[23]

Will VBP Incentives Be Large Enough to Change Practice Patterns?

The VBP Program's ability to influence change depends, at least in part, on how the incentives offered under this program compare to the magnitude of the investments that hospitals must make to achieve a given reward. In general, larger incentives are necessary to motivate more significant changes in behavior or to influence organizations to invest the resources needed to achieve change. The incentives offered under VBP in FY 2013 are quite modest. Almost two‐thirds of participating hospitals will see their FY 2013 Medicare revenues change by <0.25%, roughly $125,000 at most.[13, 24] Although these incentives may motivate hospitals that can improve performance and achievement with very modest investments, they may have little impact on organizations that need to make significant upfront investments in care processes to achieve sustainable improvements in care quality. As CMS increases the size of VBP incentives over the next 2 to 4 years, it will also hold hospitals accountable for a broader and increasingly complex set of outcomes. Improving these outcomes may require investments in areas such as information technology and process improvement that far surpass the VBP incentive reward.

Moreover, prior research suggests that financial incentives like those available under VBP may contribute only slightly to performance improvements when public reporting already exists. For example, in a 2‐year study of 613 US hospitals implementing pay‐for‐performance plus public reporting or public reporting only, pay for performance plus public reporting was associated with only a 2.6% to 4.1% increase in a composite measure of quality when compared to hospitals with public reporting only.[9] Similarly, a study of 54 hospitals participating in the CMS pay for performance pilot initiative found no significant improvement in quality of care or outcomes for AMI when compared to 446 control hospitals.[25] A long‐term analysis of pay for performance in the Medicare Premier Hospital Quality Incentive Demonstration found that participation in the program had no discernible effect on 30‐day mortality rates.[10] Finally, a study of physician medical groups contracting with a large network healthcare maintenance organization found that the implementation of pay for performance did not result in major before and after improvements in clinical quality compared to a control group of medical groups.[26]

High‐Value Care Is Not Always Low‐Cost Care

Not surprisingly, the clinical process measures included in CMS' hospital VBP program evaluate a select and relatively small group of high‐value and low‐cost interventions (eg, appropriate administration of antibiotics and tight control of serum glucose in surgical patients). However, an important body of work has demonstrated that high‐cost care (eg, intensive inpatient hospital care for common acute medical conditions) may also be highly valuable in terms of improving survival.[20, 27, 28, 29, 30] As the hospital VBP program evolves, its overseers will need to consider whether to include additional incentives for high‐value high‐cost healthcare services. Such considerations will likely become increasingly salient as healthcare delivery organizations move toward capitated delivery models. In particular, the VBP program's Medicare Spending Per Beneficiary measure, which quantifies inpatient and subsequent outpatient spending per beneficiary after a given hospitalization episode, will need to distinguish between higher‐spending hospitals that provide highly effective care (eg, care that reduces mortality and readmissions) and facilities that provide less‐effective care.

FUTURE OF VBP

Although the future of VBP is unknown, CMS is likely to modify the program in a number of ways over the next 3 to 5 years. First, CMS will likely expand the breadth and focus of incentivized measures in the VBP program. In FY 2014, for example, CMS is adding 3 outcome measures to VBP: 30‐day risk‐adjusted mortality for AMI, CHF, and pneumonia.[1] A hospital's performance on these outcomes will represent 25% of its total performance score in 2014, whereas the clinical process of care and patient experience of care domains will account for 45% and 30% of this score, respectively. In 2015, patient experience and outcome measures will each account for 30% of a hospital's performance score, whereas process and efficiency measures will each account for 20%. The composition of this performance score reflects a shift away from rewarding process‐based measures and toward incentivizing measures of clinical outcomes and patient satisfaction, the latter of which may be highly subjective and more representative of a hospital's catchment population than of the hospital's care itself.[31] Additional measures in the domains of patient safety, care coordination, population and community health, emergency room wait times, and cost control may also be added to the VBP program in FY 2015 to FY 2017. Furthermore, CMS will continue to reevaluate the appropriateness of measures already included in VBP and will stop incentivizing measures that have become topped out or are no longer supported by the National Quality Forum.[1, 13]

Second, CMS has established a gradual annual increase of 0.25 percentage points in the share of each hospital's inpatient DRG‐based payment that is at stake under VBP. In FY 2014, for example, participating hospitals will be required to contribute 1.25% of inpatient DRG payments to the VBP program. This percentage is likely to rise to 2% or more by 2017.[1, 32]
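As a quick illustration of that schedule, the snippet below projects the withhold forward from the 1% FY 2013 level; the out-years are simply an extrapolation of the stated 0.25-percentage-point trajectory, not settled policy.

```python
# Projected share of inpatient DRG-based payments withheld for VBP, assuming the
# 0.25-percentage-point annual increase continues uninterrupted from FY 2013.
for fy in range(2013, 2018):
    withhold = 1.0 + 0.25 * (fy - 2013)
    print(f"FY {fy}: {withhold:.2f}% of DRG payments at stake")
# FY 2013: 1.00% ... FY 2017: 2.00%
```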

Third, expansions of the VBP program complement a number of other quality improvement efforts overseen by CMS, including the Hospital Readmissions Reduction Program. Effective for discharges beginning on October 1, 2012, hospitals with excess readmissions for AMI, CHF, and pneumonia are at risk for reimbursement reductions for all Medicare admissions in proportion to the rate of excess rehospitalizations. Some of the same concerns raised about the hospital VBP program also apply to this program: whether readmission penalties will be large enough to affect hospital behavior, whether readmissions are even preventable,[33, 34] and whether adjustments in hospital‐level policies can reduce readmissions that are known to be heavily influenced by patient economic and social factors outside a hospital's control.[35, 36] Despite the limitations of VBP and the challenges that lie ahead, there is optimism that rewarding hospitals that provide high‐value rather than high‐volume care will not only improve outcomes of hospitalized patients in the United States but may also do so at lower cost. Encouraging hospitals to improve their quality of care may also have important spillover effects on other healthcare domains. For example, hospitals that adopt systems to ensure prompt delivery of antibiotics to patients with pneumonia may also see more prompt antibiotic management of other acute infectious illnesses that are not covered by VBP. VBP may have spillover effects on medical malpractice liability and defensive medicine as well: financial incentives to practice higher‐quality, evidence‐based care may reduce both.

The government's ultimate goal in implementing VBP is to identify a broad and clinically relevant set of outcome measures that can be used to incentivize hospitals to deliver high‐quality as opposed to high‐volume healthcare. The first wave of outcome measures has already been instituted. It remains to be seen whether the incentive rewards of Medicare's hospital VBP program will be large enough that hospitals feel compelled to improve and compete for them.

References
  1. Centers for Medicare and Medicaid Services. Hospital Value-Based Purchasing Web site. 2013. Available at: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/hospital-value-based-purchasing/index.html. Accessed March 4, 2013.
  2. VanLare JM, Conway PH. Value-based purchasing—national programs to move from volume to value. N Engl J Med. 2012;367:292-295.
  3. Joynt KE, Rosenthal MB. Hospital value-based purchasing: will Medicare's new policy exacerbate disparities? Circ Cardiovasc Qual Outcomes. 2012;5:148-149.
  4. Centers for Medicare and Medicaid Services. CMS/Premier Hospital Quality Incentive Demonstration (HQID). 2013. Available at: https://www.premierinc.com/quality-safety/tools-services/p4p/hqi/faqs.jsp. Accessed March 5, 2013.
  5. Centers for Medicare and Medicaid Services. Hospital Compare Web site. 2013. Available at: http://www.medicare.gov/hospitalcompare. Accessed March 4, 2013.
  6. Brown J, Doloresco F, Mylotte JM. "Never events": not every hospital-acquired infection is preventable. Clin Infect Dis. 2009;49:743-746.
  7. Epstein AM. Will pay for performance improve quality of care? The answer is in the details. N Engl J Med. 2012;367:1852-1853.
  8. Sutton M, Nikolova S, Boaden R, Lester H, McDonald R, Roland M. Reduced mortality with hospital pay for performance in England. N Engl J Med. 2012;367:1821-1828.
  9. Lindenauer PK, Remus D, Roman S, et al. Public reporting and pay for performance in hospital quality improvement. N Engl J Med. 2007;356:486-496.
  10. Jha AK, Joynt KE, Orav EJ, Epstein AM. The long-term effect of Premier pay for performance on patient outcomes. N Engl J Med. 2012;366:1606-1615.
  11. Houle SK, McAlister FA, Jackevicius CA, Chuck AW, Tsuyuki RT. Does performance-based remuneration for individual health care practitioners affect patient care?: a systematic review. Ann Intern Med. 2012;157:889-899.
  12. Centers for Medicare and Medicaid Services. Hospital Consumer Assessment of Healthcare Providers and Systems Web site. 2013. Available at: http://www.hcahpsonline.org. Accessed March 5, 2013.
  13. Rau J. Medicare discloses hospitals' bonuses, penalties based on quality. Kaiser Health News. December 20, 2012. Available at: http://www.kaiserhealthnews.org/stories/2012/december/21/medicare-hospitals-value-based-purchasing.aspx?referrer=search. Accessed March 26, 2013.
  14. Yasaitis L, Fisher ES, Skinner JS, Chandra A. Hospital quality and intensity of spending: is there an association? Health Aff (Millwood). 2009;28:w566-w572.
  15. Fonarow GC, Abraham WT, Albert NM, et al. Association between performance measures and clinical outcomes for patients hospitalized with heart failure. JAMA. 2007;297:61-70.
  16. Rubin HR, Pronovost P, Diette GB. The advantages and disadvantages of process-based measures of health care quality. Int J Qual Health Care. 2001;13:469-474.
  17. Jacob BA. Accountability, incentives and behavior: the impact of high-stakes testing in the Chicago public schools. J Public Econ. 2005;89:761-796.
  18. Fisher ES, Wennberg DE, Stukel TA, Gottlieb DJ, Lucas FL, Pinder EL. The implications of regional variations in Medicare spending. Part 1: the content, quality, and accessibility of care. Ann Intern Med. 2003;138:273-287.
  19. Fisher ES. Medical care—is more always better? N Engl J Med. 2003;349:1665-1667.
  20. Romley JA, Jena AB, Goldman DP. Hospital spending and inpatient mortality: evidence from California: an observational study. Ann Intern Med. 2011;154:160-167.
  21. James BC. Making it easy to do it right. N Engl J Med. 2001;345:991-993.
  22. Christensen RD, Henry E, Ilstrup S, Baer VL. A high rate of compliance with neonatal intensive care unit transfusion guidelines persists even after a program to improve transfusion guideline compliance ended. Transfusion. 2011;51:2519-2520.
  23. Lester H, Schmittdiel J, Selby J, et al. The impact of removing financial incentives from clinical quality indicators: longitudinal analysis of four Kaiser Permanente indicators. BMJ. 2010;340:c1898.
  24. Werner RM, Dudley RA. Medicare's new hospital value-based purchasing program is likely to have only a small impact on hospital payments. Health Aff (Millwood). 2012;31:1932-1940.
  25. Glickman SW, Ou FS, DeLong ER, et al. Pay for performance, quality of care, and outcomes in acute myocardial infarction. JAMA. 2007;297:2373-2380.
  26. Mullen KJ, Frank RG, Rosenthal MB. Can you get what you pay for? Pay-for-performance and the quality of healthcare providers. Rand J Econ. 2010;41:64-91.
  27. Romley JA, Jena AB, O'Leary JF, Goldman DP. Spending and mortality in US acute care hospitals. Am J Manag Care. 2013;19:e46-e54.
  28. Barnato AE, Farrell MH, Chang CC, Lave JR, Roberts MS, Angus DC. Development and validation of hospital "end-of-life" treatment intensity measures. Med Care. 2009;47:1098-1105.
  29. Ong MK, Mangione CM, Romano PS, et al. Looking forward, looking back: assessing variations in hospital resource use and outcomes for elderly patients with heart failure. Circ Cardiovasc Qual Outcomes. 2009;2:548-557.
  30. Stukel TA, Fisher ES, Alter DA, et al. Association of hospital spending intensity with mortality and readmission rates in Ontario hospitals. JAMA. 2012;307:1037-1045.
  31. Young GJ, Meterko M, Desai KR. Patient satisfaction with hospital care: effects of demographic and institutional characteristics. Med Care. 2000;38:325-334.
  32. VanLare JM, Blum JD, Conway PH. Linking performance with payment: implementing the Physician Value-Based Payment Modifier. JAMA. 2012;308:2089-2090.
  33. Walraven C, Bennett C, Jennings A, Austin PC, Forster AJ. Proportion of hospital readmissions deemed avoidable: a systematic review. CMAJ. 2011;183:E391-E402.
  34. Walraven C, Jennings A, Taljaard M, et al. Incidence of potentially avoidable urgent readmissions and their relation to all-cause urgent readmissions. CMAJ. 2011;183:E1067-E1072.
  35. Joynt KE, Jha AK. Thirty-day readmissions—truth and consequences. N Engl J Med. 2012;366:1366-1369.
  36. Joynt KE, Orav EJ, Jha AK. Thirty-day readmission rates for Medicare beneficiaries by race and site of care. JAMA. 2011;305:675-681.
Issue
Journal of Hospital Medicine - 8(5)
Page Number
271-277
Article Source
Copyright © 2013 Society of Hospital Medicine
Correspondence Location
Address for correspondence and reprint requests: Anupam B. Jena, MD, PhD, Department of Health Care Policy, Harvard Medical School, 180 Longwood Avenue, Boston, MA 02115; Telephone: 617-432-8322; Fax: 617-432-0173. E-mail: [email protected]