
Doctor, I’m Not Comfortable with That Order

A little more than 13 years ago, the Institute of Medicine (IOM) released its seminal report on patient safety, To Err Is Human.

You can say that again. We humans sure do err. It seems to be in our very nature. We err individually and in groups, with or without technology. We also do some incredible things together, like flying jets across continents, building vast networks of communication and learning, and devising and delivering nothing-short-of-miraculous health care that can embrace the ill and fragile among us, cure them, and send them back to their loved ones. Those same amazing, complex accomplishments, though, are at their core human endeavors. As such, they are inherently vulnerable to our errors and mistakes. As we know, in high-stakes fields like aviation and health care, those mistakes can compound into catastrophic results.

The IOM report highlighted how known human error in health care adds up to mind-boggling numbers of injured and dead patients, obviously a monstrous result that nobody intends.

The IOM safety report didn’t just sound the alarm; it recommended a number of sensible steps the nation should take to help manage human error: urging leaders to foster a national focus on patient safety, developing a public mandatory reporting system for medical errors, encouraging complementary voluntary reporting systems, raising performance expectations and standards, and, importantly, promoting a culture of safety in the health care workforce.

How are we doing with those sensible recommendations? Apparently to delay is human too.

Every year or so, of course, we trot out campaigns and outrage about the patient safety problem. We also have lots of programs and initiatives attempting to address safety, institution by institution and through federal agencies. But we arguably have not credibly and systematically addressed the major recommendations in that report. We’re not even close. And every single day we still have major safety problems in almost every aspect of U.S. health care.

For example, we do not have anything like mandatory reporting of misses and near misses. We did get Patient Safety Organizations (PSOs) through the Patient Safety and Quality Improvement Act of 2005. PSOs are a loose network of designated entities scattered across the nation that gather information, on a voluntary basis, about some adverse events. They submit that data to a website operated by the Agency for Healthcare Research and Quality (AHRQ), called, somewhat ominously, the “PSO Privacy Protection Center.” It’s not clear what, if anything, happens with that information. It is clear, though, that the primary concern seems to be protecting the privacy of the information rather than using it urgently to address safety. More recently, AHRQ requested permission to run a pilot program that would facilitate consumer, as opposed to professional, reporting of medical errors. That experimental program is still under consideration.

In 2009, the Robert Wood Johnson Foundation, through its Pioneer Portfolio, extended a two-year planning grant to a group interested in creating a public-private response to the health care safety challenge, similar to the Commercial Aviation Safety Team (CAST). That group explored the possibility of creating a Public-Private Partnership to Promote Patient Safety (P5S). As CAST does in aviation, the P5S would work to identify and mitigate safety hazards. The group found numerous barriers to such a health care partnership and so far has yet to find its national footing. In health care, in spite of federal legislation and national attention, we seem to be having a hard time even creating the kind of error-reporting surveillance systems the aviation industry has had for years, much less establishing collaborations to handle reported problems.

How about that “culture of safety”? Have we aggressively pursued every possible avenue to ensure that health professionals, patients, and families feel comfortable and empowered to look for, find, talk about, and resolve safety problems? Do most health professionals feel free to talk openly about mistakes and near misses with each other, as a team? These questions are obviously rhetorical. That’s unfortunate, because this culture issue may be the linchpin of successful management of error in medicine. We are collectively having a difficult time meeting these decade-old IOM recommendations, especially those requiring vast new data sources, reporting capability, and tricky collaborations. Maybe we should instead look hard at the root of the problem: the human factor, our inherent propensity to err, and the ways the professional culture handles that basic fact.

In several recent posts on his terrific Not Running a Hospital blog, former BIDMC CEO Paul Levy touches on these themes. He focuses on Crew Resource Management (CRM), an approach to error prevention used in aviation that should have applicability in health care. In those posts he cites an article that describes the use of CRM in the ICU.

Those authors note that,

“[i]n aviation, non-technical skills, a blame-free environment and Team Situational Awareness (SA) are considered CRM core competencies that require specific and focused training.”

Those same authors also observed:

“The archetypical medical specialist’s personality (highly motivated, A-type, control freak) helps in creating an environment in which a junior team member could feel inhibited to offer input in a senior team with ‘vertical’ leadership. This impacts Team SA, posing a threat to process safety, and thus patient safety.”

Their point, like the IOM’s, is that the human propensity to err is at the very core of our safety problems.

What if a large part of the answer to our safety challenge is not more and more layers of technical capability?  What if, instead, this challenge first and foremost requires the basics—like attention to team skills, composition, function, and training?  What if we worked hard to teach all health professionals and help all patients and families to be observant, assertive, and vocal about mistakes and potential mistakes?  What if we deliberately created enlightened clinical environments in which we embraced our human frailties, rather than worked so hard to deny them?

Although errors will always be part of our nature, they do not necessarily control our destiny. Remember, Pope didn’t just say, “To err is human.” His full quote is important: “To err is human; to forgive, divine.” It’s not so much the errors; we all make them. Maybe it’s what we do together with those errors that ultimately matters most.

Michael W. Painter, JD, MD is the senior program officer at the Robert Wood Johnson Foundation.


Comments

Jack Moye

What’s wrong with medical care in the US? It’s a business! The purpose of a business is to maximize income for the business entity. The purpose of medical care should be to maximize the health and wellbeing of the patient. Private and governmental tinkering with the system has produced an entity that does neither but has been a boon to the insurance industry. Consider an earlier time in America when doctors were dedicated to the patient’s health and made accommodations for those who had difficulty paying. Hospitals, for the most part, were owned and operated by the county and were…

Shirie Leng, MD

I think one of the big problems is that our patients expect that we are NOT human. They expect perfection. Fear of lawsuits prevents us from disclosing mistakes, especially those that have no impact on the patient. Systems issues, especially with regard to computer programs and meaningful use, are not correcting mistakes as they were expected to. And we actually have implemented a whole boatload of patient safety initiatives, such as National Patient Safety Goals. In surgery we now have time-outs before the time-outs. The result has mostly been more paperwork, not more safety. Patients are sicker, we have more…