Unbiased Analysis of Today's Healthcare Issues

Evaluating VBP Programs

Written By: Jason Shafrin - Dec 08, 2010

The call for the adoption of value-based purchasing (VBP) programs has grown in popularity in recent years.  These programs give physicians, hospitals, or other providers bonuses (or penalties) depending on the quality of care their patients receive. The Affordable Care Act (ACA, a.k.a. Health Reform) includes provisions to establish a VBP program for hospital payments based on hospital quality reporting; a national, voluntary, 5-year bundled payment pilot program; and a new payment structure for providers organized as accountable care organizations.

Despite the popularity of these programs, evaluating health outcomes, or even documenting the processes required to produce them, is difficult.  As new VBP programs come online, it will be increasingly important to evaluate these demonstrations and identify best practices.  A paper by McHugh and Joshi (2010) makes some recommendations on how to improve evaluations of value-based purchasing programs.  A summary of their recommendations is below.

| Problem | Recommendation | Focus | Target Audience |
|---|---|---|---|
| Limited information on implementation and management of VBP programs | Early and continuous collection of data on implementation | Methods, infrastructure support | Researchers, policymakers, providers |
| Limited generalizability of findings | More experimentation and greater variation in VBP | Data, methods | Researchers, policymakers, providers |
| Lack of meaningful outcome measures | Improved methods for risk adjustment, data validation, and measurement composition | Data, methods | Policymakers, providers, researchers |
| Lack of integrated and aggregated data | Support for EHR and HIT systems | Infrastructure support | Policymakers, providers |
| Limited ability to synthesize learning from diverse VBP efforts | Better practices and methods for synthesizing VBP program findings | Methods, infrastructure support | Researchers, policymakers, providers |

Most of these recommendations are sensible.  For instance, including data ‘checks’ to ensure valid data collection would be useful.  Some suggestions, however, are more controversial.  In particular, the authors call for “more experimentation and greater variation in VBP.”  Experimentation involves a tradeoff, however.  If I were a patient at a hospital under a VBP system, I would hope that the VBP program would be optimized given the current state of knowledge.  Experimentation could of course produce a superior system, but it could also create a worse one.  Thus, although increasing variation in VBP implementation would help researchers learn more and better understand if, and under what circumstances, VBP works, payers also have a duty to ensure that patient care is optimized in the short run.

The authors give the example of Geisinger Health System’s ProvenCare, which offers a single-episode price for CABG surgeries.  Geisinger’s integrated health system likely contributed to the success of this program.  In more decentralized health systems, should the goal be to implement a ProvenCare replica to see if it works in other settings, or to design a VBP program that is more tailored to the needs of the specific patients and providers it serves?  I’d tend to side with the latter strategy.


One Comment

  1. Che Resa says:

    You can begin to assess quality care by what you don’t have: hospital-acquired infections, falls, medication errors, repeat labs, etc., … let alone serious events!
