Get a group of health policy experts together and you’ll find one area of near-universal agreement: we need more transparency in healthcare. The notion behind transparency is straightforward: greater availability of data on provider performance helps consumers make better choices and motivates providers to improve. And there is some evidence to suggest it works. In New York State, after cardiac surgery reporting went into effect, some of the worst-performing surgeons stopped practicing or moved out of state, and overall outcomes improved. But when it comes to hospital care, the impact of transparency has been less clear-cut.
In 2005, Hospital Compare, the national website run by the Centers for Medicare and Medicaid Services (CMS), started publicly reporting hospital performance on process measures, many of which were evidence-based (e.g., using aspirin for acute MI patients). By 2008, evidence showed that public reporting had dramatically increased adherence to those process measures, but its impact on patient outcomes was unknown. A few years ago, Andrew Ryan published an excellent paper in Health Affairs examining just that, and found that more than three years after Hospital Compare went into effect, there had been no meaningful impact on patient outcomes. Here’s one figure from that paper:
The paper was widely covered in the press, and many saw it as a failure of public reporting. Others wondered if it was a failure of Hospital Compare itself, where the data were difficult to analyze. Some critics shot back that Ryan had only examined the period when public reporting covered process measures, and that it would take public reporting of outcomes (i.e., mortality) to actually move the needle on lowering mortality rates. And, in 2009, CMS started doing just that: publicly reporting mortality rates for nearly every hospital in the country. Would it work? Would it actually lead to better outcomes? We didn’t know, and decided to find out.