Tag: US News and World Report

Potential Bias in U.S. News Patient Safety Scores

Hospitals can get overwhelmed by the array of ratings, rankings and scorecards that gauge the quality of care that they provide. Yet when those reports come out, we still scrutinize them, seeking to understand how to improve. This work is only worthwhile, of course, when these rankings are based on valid measures.

Certainly, few rankings receive as much attention as U.S. News & World Report’s annual Best Hospitals list. This year, as we pored over the data, we made a startling discovery: As a whole, Maryland hospitals performed significantly worse on a patient safety metric that counts toward 10 percent of a hospital’s overall score. Just three percent of the state’s hospitals received the highest U.S. News score in patient safety — 5 out of 5 — compared to 12 percent of the remaining U.S. hospitals. Similarly, nearly 68 percent of Maryland hospitals, including The Johns Hopkins Hospital, received the worst possible mark — 1 out of 5 — while nationally just 21 percent did. This had been a trend for a few years.

Continue reading…

Secrets to Choosing the Right Medical School

The competition to get into medical school is fierce.  The Association of American Medical Colleges just announced that this year, nearly 50,000 students applied for just over 20,000 positions at the nation’s 141 MD-granting schools – a record.  But medical schools do not have a monopoly on selectivity.  The average student applies to approximately 15 schools, and many are accepted by more than one.  Students attempting to sort out where to apply and which admission offer to accept face a big challenge, and they often look to medical school rankings for guidance.

Among the organizations that rank medical schools, perhaps the best-known is US News and World Report (USNWR).  It ranks the nation’s most prestigious schools using the assessments of deans and chairs (20%), assessments by residency program directors (20%), research activity (grant dollars received, 30%), student selectivity (difficulty of gaining admission, 20%), and faculty resources (10%).   Based on these methods, the top three schools are Harvard, Stanford, and Johns Hopkins.
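
To see how those weights combine, here is a minimal sketch of the arithmetic in Python. The component names and example values are hypothetical, and the real USNWR calculation standardizes each component before weighting; only the 20/20/30/20/10 percent split comes from the methodology described above.

```python
# Minimal sketch of a weighted composite ranking score.
# Only the weights reflect the USNWR methodology described above;
# the component names and example values are hypothetical.

WEIGHTS = {
    "peer_assessment": 0.20,       # assessments by deans and chairs
    "residency_directors": 0.20,   # assessments by residency program directors
    "research_activity": 0.30,     # grant dollars received
    "student_selectivity": 0.20,   # difficulty of gaining admission
    "faculty_resources": 0.10,
}

def composite_score(components: dict[str, float]) -> float:
    """Weighted sum of component scores, assumed normalized to a 0-100 scale."""
    return sum(WEIGHTS[name] * value for name, value in components.items())

# A hypothetical school's normalized component scores:
example_school = {
    "peer_assessment": 90.0,
    "residency_directors": 85.0,
    "research_activity": 95.0,
    "student_selectivity": 88.0,
    "faculty_resources": 75.0,
}

print(round(composite_score(example_school), 1))  # 88.6
```

Because research activity carries the single largest weight, a research-heavy school can outscore one with stronger teaching resources in a sketch like this, which is worth keeping in mind when reading the question the students were asked below.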

Rankings seem important, but do they tell applicants what they really need to know?  I recently sat down with a group of a dozen fourth-year medical students who represent a broad range of undergraduate backgrounds and medical specialty interests.  I posed this question: How important are medical school rankings, and are there any other factors you wish you had paid more attention to when you chose which school to attend?

Continue reading…

Why Transparency Doesn’t Work.

The Cleveland Clinic is by far the best provider of cardiac care in the nation. If you have cancer, there is no better place to be than Texas. Johns Hopkins is the greatest hospital in America.

Why? Because US News and World Report suggests as much in its hospital rankings.

But which doctors at the Cleveland Clinic have the highest success rates in aortic valve repair surgeries? What are the standardized mortality rates due to cancer at University of Texas MD Anderson Cancer Center? Why exactly is Johns Hopkins the best?

We don’t have answers to these types of questions because in the United States, unlike in the United Kingdom, data is not readily available to healthcare consumers.

The truth is, the rankings with which most patients are familiar provide users with little of substance. Hospitals are evaluated largely by “reputation,” while the details that would actually be useful to patients trying to get the best care are omitted.

Of course, the lack of data available about US healthcare is not US News and World Report’s fault – it is indicative of a much larger issue. Because the United States lacks a centralized healthcare system, patients, news sources, and policy makers are left without the information necessary for proper decision-making.

While the United Kingdom’s National Health Service may have its own issues, one benefit of a system overseen by a single governmental entity is proper data gathering and reporting. If you’re a patient in the United Kingdom, you can look up everything from waiting times for both diagnostic procedures and referral-to-treatment all the way to mortality and outcome data by individual physician.

Contrast this with the US healthcare system, where the best sources of data rely on the voluntary reporting of information from one private entity to another.

That arrangement is riddled with issues, including a lack of standardization and oversight, and it leaves patients with limited data, surfacing mainly in profit-driven endeavors like US News and World Report or in initiatives like The Leapfrog Group that are far less well known and contain too few indicators to be of real use.

The availability of data in the United Kingdom pays dividends. For example, greater understanding of performance has allowed policy makers to consolidate care centers that perform well and close those that hemorrhage money, cutting costs while improving outcomes.  Even at the individual hospital level, the availability of patient data keeps groups on their toes.

Continue reading…

Why Do Academic Medical Centers Do Poorly on Quality Report Cards?

In September 2012, the Joint Commission recognized 620 hospitals (about 18% of the total number of accredited American hospitals) as “top performers,” but many were surprised when some of the biggest names in academic medical centers failed to make the cut.  Johns Hopkins, Massachusetts General Hospital, and the Cleveland Clinic (perennial winners in the US News & World Report best hospital competition) did not qualify when the Joint Commission based their ranking not on reputation but on specific actions that “add up to millions of opportunities ‘to provide the right care to the patients at American hospitals.’”

The gap between the perceived reputation of America’s “best” hospitals and medical schools and their performance on an evidence-based medicine report card provides an interesting lens through which to understand the role and performance of America’s academic medical centers in the 21st century.

The most pressing challenge for American medicine has been summarized in the triple aim:  how to cut the per-capita cost of healthcare, how to improve the quality and experience of care for patients, and how to improve the health and wellness of specific populations.

Can we expect academic medical centers to lead the country in meeting the challenge?  If history is any guide, the answer may be no.  In a 2001 article titled “Improving the Quality of Health Care:  Who Will Lead?” the authors write:

“We see few signs that academic medical leaders are prepared to expend much effort on health care issues outside the realms of biomedical research and medical education.  They exerted little leadership in what may arguably be characterized as the most important health policy debates of the past thirty years:  tobacco control, health care cost containment, and universal access.”

Having been a professor at several medical schools (UCSF, University of Iowa, Allegheny University of the Health Sciences, and Michigan State), I learned early on that the key to academic advancement was NIH-funded basic science research.  While lip service was paid to the ideal triple-threat professor (great clinician, superb teacher, and peer-reviewed published investigator), the tenure process clearly produced a culture in which funded research counted far more than teaching and clinical care delivery.

Continue reading…

US Rumor and Hospital Report

It has been almost four years since I commented on the annual hospital ranking prepared by US News and World Report.  I have to confess now that I was relatively gentle on the magazine back then.  After all, when you run a hospital, there is little to be gained by critiquing someone who publishes a ranking that is read by millions.  But now it is time to take off the gloves.

All I can say is, are you guys serious?  Let’s look at the methodology used for the 2011-12 rankings:

In 12 of the 16 [specialty] areas, whether and how high a hospital is ranked depended largely on hard data, much of which comes from the federal government. Many categories of data went into the rankings. Some are self-evident, such as death rates. Others, such as the number of patients and the balance of nurses and patients, are less obvious. A survey of physicians, who are asked to name hospitals they consider tops in their specialty, produces a reputation score that is also factored in.

Here are the details:

Survival score (32.5 percent). A hospital’s success at keeping patients alive was judged by comparing the number of Medicare inpatients with certain conditions who died within 30 days of admission in 2007, 2008, and 2009 with the number expected to die given the severity of illness. Hospitals were scored from 1 to 10, with 10 indicating the highest survival rate relative to other hospitals and 1 the lowest rate. Medicare Severity Grouper, a software program from 3M Health Information Systems used by many researchers in the field, made adjustments to take each patient’s condition into account.
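
To make the observed-versus-expected comparison concrete, here is a minimal sketch in Python under simplifying assumptions: the hospital figures are invented, the decile binning is only one plausible way to map relative performance onto a 1-to-10 scale, and the Medicare Severity Grouper risk adjustment is not reproduced.

```python
# Minimal sketch of an observed-vs-expected mortality score.
# Hypothetical data; the real methodology risk-adjusts expected deaths
# with 3M's Medicare Severity Grouper and uses three years of Medicare data.

def oe_ratio(observed_deaths: int, expected_deaths: float) -> float:
    """Observed/expected 30-day deaths; below 1.0 means better than expected."""
    return observed_deaths / expected_deaths

def survival_scores(ratios: dict[str, float]) -> dict[str, int]:
    """Assign 1-10 scores by decile: 10 for the lowest (best) O/E ratios,
    1 for the highest (worst), relative to the other hospitals."""
    ranked = sorted(ratios, key=ratios.get)   # best performers first
    n = len(ranked)
    return {name: 10 - (rank * 10 // n) for rank, name in enumerate(ranked)}

# Hypothetical hospitals: observed deaths vs. risk-adjusted expected deaths.
ratios = {
    "Hospital A": oe_ratio(80, 100.0),    # 0.80 -- fewer deaths than expected
    "Hospital B": oe_ratio(105, 100.0),   # 1.05
    "Hospital C": oe_ratio(130, 100.0),   # 1.30 -- more deaths than expected
}

print(survival_scores(ratios))  # {'Hospital A': 10, 'Hospital B': 7, 'Hospital C': 4}
```

Note that in a relative scheme like this, a hospital’s score depends as much on how its peers perform as on its own mortality figures, which is one reason the methodology invites the scrutiny that follows.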

Continue reading…
