Grading Hospital Report Cards (Again)

Medicare recently delayed a plan to issue a simple “star” rating of individual hospitals’ care after 60 senators and 225 House members signed letters supported by major industry groups that questioned Medicare’s methodology.

Rick Pollack, president and chief executive officer of the American Hospital Association (AHA), hailed the hiatus and pledged to make ratings more “useful and helpful for patients.” Perhaps. But while a summary grade for care quality has never fit hospitals—where the orthopedists could have a leg up on competitors, while the cardiac surgeons’ results are disheartening—it’s also true that hospitals have consistently fought attempts at transparency. Over an astonishing stretch of almost 100 years, they’ve done so crudely (burning the results of the first national quality survey in a hotel furnace to keep them from the press), through the courts (suing to prevent release of infection data), and using political clout.

Nonetheless, if the hospital groups that sought this delay—the AHA, the Association of American Medical Colleges, America’s Essential Hospitals, and the Federation of American Hospitals—truly seek to sever the industry from its self-protective past, they don’t have to wait for government. As the Medicare Access & CHIP Reauthorization Act of 2015 (MACRA) accelerates the move to value-based payment, there are significant actions hospitals could take to build a better report card right now.

Room For Improvement

The biggest opportunity lies in data reliability. Pollack rightly asserts that “consumers need reliable, factual information to make critical care decisions.” However, in a recent speech to health care journalists, Doris Peter, director of the Health Ratings Center for Consumer Reports, singled out underreporting of data by hospitals and other providers, missing data, and “no quality control around data” as major challenges.

Hospital groups could help solve this problem by requiring their members’ data submissions to meet the same rigorous criteria the industry demands of the government’s star rating program.

In addition, hospitals that are part of a local health system are currently lumped into a single Medicare provider file. That’s like giving consumers one local health department report for four separate restaurants that happen to share an owner. Hospital trade groups should support data changes that enable the government to break out individual hospital data and, right now, urge their members (like the hospital system with a facility near my home!) to voluntarily do so.


There’s also an urgent need to improve data timeliness. Information on Medicare’s Hospital Compare site is typically a year or more old, as is information on other sites dependent upon government claims data. States can be much worse. New York provides detailed heart surgery mortality information on both hospitals and individual surgeons. Yet the latest public report on this serious procedure covers the three-year period 2010-2012! The time lag undermines usefulness while providing ammunition to critics who claim that the public doesn’t care about report cards.

To support more meaningful transparency, industry leaders should call on all hospitals to routinely release detailed quality metrics as quickly as they’re internally available. At the same time, the four groups who sought the Medicare star ratings delay could ask those 60 senators and 225 House members to mandate and fund more rapid claims data processing. State hospital associations could launch similar initiatives.

Of course, it’s important to emphasize that hospitals don’t have to wait for government permission to release their own data or format it in a manner consumers can understand.

Unfortunately, many hospitals remain highly selective about releasing information. To cite one prominent example, the website of Harvard’s Massachusetts General Hospital boldly displays its ranking for 2015-2016 from U.S. News & World Report as “Number 1 in the Nation.” However, specifics shared on the Mass General website might as well come with cobwebs. To pick just a few categories: data on surgical events, care management events, and door-to-needle time are from 2013. And patient falls, “managing serious errors,” and hand hygiene data date back to 2014.

By the way, if you’re curious how Mass General stacks up against household names like the Mayo Clinic hospitals, Johns Hopkins, or Stanford, so are they. Membership organizations like Vizient allow hospitals to share detailed quality reports confidentially among themselves.


One area where Mass General does exhibit admirable transparency is posting its latest survey results from the Joint Commission. Although the Joint Commission has deemed status from Medicare (which means its surveys are deemed an acceptable substitute for government inspections), it is very difficult to find individual facility information on the commission website. The AHA and other provider organizations that control the commission board could transform opacity to openness overnight. Hospital trade groups could also recommend a voluntary “best practice” to all their members that includes prompt and prominent posting of survey results.

Speaking of best practices, Kentucky’s Norton Healthcare system reports in a simple visual format almost 600 nationally recognized quality indicators for the system and its individual facilities. The effort began in 2005 by disclosing about 200 indicators and includes core principles such as, “We do not decide what to make public based on how it makes us look.” The hospital industry could recommend that Kentucky’s two senators, Majority Leader Mitch McConnell and former GOP presidential candidate Rand Paul, sponsor hearings on what the private sector and government can learn from Norton’s experience.

The research literature amply documents the conflicting messages and confusion that confound many current report cards. Most recently, the accuracy of a group of commonly used patient safety measures was the subject of academic criticism and a strongly worded response.

Providers understandably worry that the portrayal of the care they provide will be distorted by flawed analytical methods, and there is a need for more research on the best way to communicate comparative information. But as advocates for transparency have been forced to repeatedly emphasize over the years, it is also dangerous to make the impossibly perfect the enemy of the achievable good.

At about the same time that hospital groups were halting a potentially misleading report card from the government, a new policy analysis predicted that expanded consumer use of provider performance report cards lies just ahead. Individual hospitals and the industry remain quick to point out flaws in others’ report cards to the press and policymakers while keeping mum about the wealth of comparative quality information they use internally. That is particularly true of clinical measures, the “gold standard” of accuracy in quality assessment.

Although star ratings may be more suitable for hotels than hospitals, the hospital industry should still be held accountable for helping build a “useful and helpful for patients” alternative. Hospital groups don’t have to wait for government to lead their members to take the specific steps needed to build a reliable, timely, and complete report card the public can trust.

Michael Millenson is a principal at Health Quality Advisors LLC. This post first appeared in the Health Affairs Blog. 

