What Business Can Learn From Cleveland Clinic: How To Report Quality To The Public

This summer I spent some time exploring how big teaching hospitals report clinical outcomes to the public. For a given set of patients, how many live or die? And with what complications?

Patients can rarely find this information before getting elective surgery, or when deciding to commit to a given institution for a long-term course of treatment.

The problem is that right now there are few short-term incentives for hospitals to be transparent with the public. Patients are used to finding care based on proximity, word-of-mouth, and referrals from trusted physicians. (None of these are bad methods, by the way.)

Meanwhile insurers and public programs rarely pay for better outcomes, so they do not build networks that steer patients to quality. Paternalism pervades the entire system, where insurers and providers alike do not trust patients to shop for the best care.

Thus it is only the most long-term focused institutions that decide to become radically transparent. And there’s one that stands out above the rest: Cleveland Clinic.

The Ohio institution is already known for excellent care, especially in cardiology, for being a “well-oiled machine”, and for being an economic bright spot in the otherwise dreary environs of Cuyahoga County. (Sorry, as a Pittsburgher it’s hard for me to say nice things about the Mistake By The Lake.)

But something else Cleveland Clinic should be known for is its public outcomes reporting. Every year since at least 2005 Cleveland Clinic has published Outcomes Books on its Web site. For each clinical category it releases data on mortality, complication rates, and patient satisfaction. It also mails paper copies of these books to specialists around the country as a kind of transparency-marketing. No other hospital system comes close to reporting this level of detail about the quality of its care.

A few thoughts in no particular order:

*The extent of Cleveland Clinic’s reporting is unique in the industry and must be very costly to the institution. The context might provide an explanation. The Cleveland Clinic health system is located in a shrinking population center. Cleveland shrank more between the last two censuses than any metro except post-Katrina New Orleans. The organization has grown by providing high-quality medicine that attracts patients nationally and internationally, especially Canadians and Middle Easterners. It would be interesting to know how much of a role outcomes reporting plays here. If you look at Page 9 of the Heart & Vascular service’s Outcomes Book you’ll see that Cleveland’s mortality ratio for open-heart surgery is lower than at any of its unnamed competitors. If I were a millionaire in Toronto deciding where to pay cash for a bypass, I’d look favorably at that data.

*Outcomes reporting is a great service to the public but right now it’s so ahead of the curve that it’s hard to use. If you read the Outcomes Book about Head & Neck surgery, for instance, there is a lot of wonky detail. Page 16 shows a 70% five-year survival rate for stage IV carcinoma of the tongue. It’s difficult to put that number in context since few other hospitals report the same information.

*Yet, even if cross-hospital comparisons are difficult, the exercise of reporting is itself bound to increase quality. Think about its effect on staff. Physicians, nurses and administrators know that their outcomes will be visible to colleagues and rivals inside and outside the hospital. Internal units will compete not just on traditional measures–profitability, volume–but also the most important one: quality. That dynamic should be reassuring to potential patients. Your hospital should have enough confidence in its services to pull back the curtain on this kind of data.

*To be able to make comparisons among facilities, you have to standardize reporting and have an objective clearinghouse for it. In the world of finance the S.E.C., Morningstar and Bloomberg collect and package data for interested constituents, such as investors. In health care this has not happened yet, though there have been attempts. The nonprofit safety group Leapfrog collects infection rates, for example. Medicare reports on process measures. HealthGrades has terrific star ratings based on mortality and complication rates.

Cleveland Clinic and several other major institutions chose to stop reporting to Leapfrog last year, citing the burden of reporting to multiple sources. The lesson here is that there will be some jostling over who’s in control. And Leapfrog, which never excelled at packaging the data it collected, lost out. I think a better approach would be to “over-report” to a central entity, such as Medicare if it can get its act together (the S.E.C. of health care), and then have other entities make a market for processing that data, whether HealthGrades, Leapfrog, or new players (analogues to Value Line, Lipper, Bloomberg etc.).

There’s a lesson here for businesses outside health care as well. Quality reporting by auto companies, airlines, hotels, and other major industries is sporadic. Regular self-reporting on how well cars hold up or how often planes leave on time would pay off. Customers who shop for quality appreciate accountability. And reporting would motivate managers internally to hold themselves to a high standard.

Cleveland Clinic is a rare example of a health care enterprise that’s more cutting edge than those in other industries.

David Whelan is a contributing editor at Forbes, where he was a staff writer for 8 years covering health care payers, providers and policy. He’s currently studying and working in hospital administration. Follow him on Twitter @WhelanHealth. This post first appeared on Forbes.