
US Rumor and Hospital Report

It has been almost four years since I commented on the annual hospital ranking prepared by US News and World Report.  I have to confess now that I was relatively gentle on the magazine back then.  After all, when you run a hospital, there is little to be gained by critiquing someone who publishes a ranking that is read by millions.  But now it is time to take off the gloves.

All I can say is, are you guys serious?  Let’s look at the methodology used for the 2011-12 rankings:

In 12 of the 16 [specialty] areas, whether and how high a hospital is ranked depended largely on hard data, much of which comes from the federal government. Many categories of data went into the rankings. Some are self-evident, such as death rates. Others, such as the number of patients and the balance of nurses and patients, are less obvious. A survey of physicians, who are asked to name hospitals they consider tops in their specialty, produces a reputation score that is also factored in.

Here are the details:

Survival score (32.5 percent). A hospital’s success at keeping patients alive was judged by comparing the number of Medicare inpatients with certain conditions who died within 30 days of admission in 2007, 2008, and 2009 with the number expected to die given the severity of illness. Hospitals were scored from 1 to 10, with 10 indicating the highest survival rate relative to other hospitals and 1 the lowest rate. Medicare Severity Grouper, a software program from 3M Health Information Systems used by many researchers in the field, made adjustments to take each patient’s condition into account.

Patient safety score (5 percent). Harmful blunders occur at every hospital; this score reflects how hard a hospital works to prevent six of the most egregious types. A 3 puts a hospital among the 25 percent of those that were best in this regard, a 2 in the middle 50 percent, and a 1 in the lowest 25 percent. Examples of the six kinds of medical episodes factored in are deaths of patients whose conditions should not have put them at significant risk and surgical incisions that reopen.

Reputation (32.5 percent). Each year, 200 physicians per specialty are randomly selected and asked to list hospitals they consider to be the best in their specialty for complex or difficult cases. A hospital’s reputational score is based on the total percentage of specialists in 2009, 2010, and 2011 who named the hospital. This year some physicians were asked to list up to five hospitals, the rest to list up to 10.

Other care-related indicators (30 percent). These include nurse staffing, technology, and other measures related to quality of care. The American Hospital Association’s 2009 survey of all hospitals in the nation was the main source.

Let’s see how this pans out for one specialty, pulmonology. We see that the number 1 and 2 ranked hospitals have great reputations but the lowest score for patient safety.  The first hospital with “superior” safety rankings doesn’t appear until number 21.

The reputational data is opaque, as it has to be.  With great respect for the 200 pulmonologists who were surveyed, how much current data have they seen about the outcomes achieved by hundreds of hospitals and thousands of doctors around the country?  Answer:  None.  Why?  Because there is no current data published on such outcomes.  Likewise, there is no current data published about hospital-related infections, falls, medication errors, and other matters that could affect the treatment of a pulmonary patient, even if the pulmonologists are top-notch.

So, the reputational survey is likely to be based on the following type of “information”:

Oh, I like Dr. Smith at ABC hospital.  We were in residency together 25 years ago.  He was a great guy.  I still remember that amazing Christmas party in 1986.

Or, maybe:

That Dr. Jones at XYZ Hospital is terrific. I heard him give a paper at the last meeting of the ATS (or ACCP, or AABIP.)  His PowerPoint presentation about his clinical successes (or research with mouse models) was gripping.

Or, maybe:

Dr. Pebble was trained by Dr. Stone, one of the best in the business in his day (40 years ago.)  That’s good enough for me.

Or, even:

I sent a really sick patient to Dr. Good at RST Hospital.  He saved her life.  It was a very tough case, and he deserves a lot of credit.

US News needs to stop relying on unsupported and unsupportable reputation, often influenced by anecdote, personal relationships and self-serving public appearances, and work on real — and more recent — data. Maybe that will also cause hospitals to be more willing to report their data so they can be named to the “Honor Roll.” As it is, you are better off keeping things opaque to protect your reputation.

I think it is time to acknowledge that this ranking offers very little in the way of valuable information.  It is mainly a vehicle for advertisements from the pharmaceutical industry, which knows that this issue of the magazine gets a lot of attention and high circulation.  As you flip through to each specialty, you are blasted with ads for drugs related to syndromes within that specialty.  Here’s the top part of the pulmonology page:

Then, if you click through to “find resources about” a particular disease, you do get some nice content information, but you get sprayed with even more ads.

There would be no market for this magazine survey if the government or insurance companies did their job and displayed real-time clinical outcome data.  But those with the reputational advantage do not want that to happen.  And those who profit from the lack of data also have nothing to gain by a more open presentation of the actual record and qualifications of hospitals and doctors in each specialty.

Paul Levy is the former President and CEO of Beth Israel Deaconess Medical Center in Boston. For the past five years he blogged about his experiences in an online journal, Running a Hospital. He now writes as an advocate for patient-centered care, eliminating preventable harm, transparency of clinical outcomes, and front-line driven process improvement at Not Running a Hospital.

4 replies »


  2. Nobody in the hospital executive suites thinks those rankings mean anything. They play along because they have to. They spend huge dollars to splash their USNWR ranking in advertising not because they think it is meaningful, but because the public does, and a real comparative advantage — what you might honestly use to distinguish your product — is nearly impossible for hospitals to articulate. (Like banks.)

    Every newspaper in the country has learned they can sell advertising to hot dog stands the week they rank the hot dog stands. We have a regional business newsletter that collects “data” on medical offices in our specialty and then ranks the group practices based on their one page survey. They also happen to offer discounts on advertising in that edition for medical group practices.

    All of the writers for these articles know very well they are intentionally blurring the line between journalism and sales.

  3. Excellent post. Anybody who closely examines the U.S. News methodology will realize that much of their survey rankings are based on dubious (“junk” might be a better word) science. Furthermore, I recall seeing at least a couple of peer reviewed published studies which found no statistical association between the U.S. News hospital rankings and rankings based on objective measures of health outcomes. I just wish the hospital industry would not cooperate with the U.S. News racket—it does not help when those hospitals that have received high rankings shamelessly plaster billboards all over their towns celebrating their ranking.