Tag: Patient Safety

My Patient’s Keeper

Six years ago, my husband saved my life.

I had a severe allergic reaction to a medicine in the hospital in the middle of the night; he ran for the nurse. As for me, despite being a doctor myself, I couldn’t even breathe, let alone call for help. And so, even before and certainly since, I advise my patients not to be alone in the hospital if they can help it. I don’t even think anyone should be alone for office visits. There is too much opportunity to misunderstand the doctor, forget to ask the right questions, or misremember the answers.

National organizations like the American Cancer Society give the same advice: when possible, bring a friend.

As a patient safety researcher and an advocate for high quality healthcare, however, I find giving this advice distasteful. Is a permanent sidekick really the best we can do to keep patients safe? What about those who are already vulnerable because they don’t have such a superhero in their lives, or that superhero just has to punch in at some inflexible job?

Let’s take another look at the circumstances that ended with my husband shouting, panic-stricken, in the hallway. The medicine I was given is known to cause severe allergic reactions. The risk is so well established, in fact, that the standard protocol for giving this medication is to give a small test dose first. It was the test dose that nearly did me in. The hospital followed standard procedure by giving me the test dose. But they chose to do it at midnight, when the hospital is staffed by a skeleton crew, even though the medicine wasn’t urgent. Strike one for safety. Continue reading…

Is the Patient Safety Movement in Critical Condition?

These should be the best of times for the patient safety movement. After all, it was concerns over medical mistakes that launched the transformation of our delivery and payment models from a focus on volume to one that rewards performance. The new system (currently a work-in-progress) promises to put skin in the patient safety game as never before.

Yet I’ve never been more worried about the safety movement than I am today. My fear is that we will look back on the years between 2000 and 2012 as the Golden Era of Patient Safety, which would be okay if we’d fixed all the problems. But we have not.

A little history will help illuminate my concerns. The modern patient safety movement began with the December 1999 publication of the IOM report on medical errors, which famously documented 44,000-98,000 deaths per year in the U.S. from medical mistakes, the equivalent of a large airplane crash each day. (To illustrate the contrast, we just passed the four-year mark since the last death in a U.S. commercial airline accident.) The IOM report sparked dozens of initiatives designed to improve safety: changes in accreditation standards, new educational requirements, public reporting, promotion of healthcare information technology, and more. It also spawned parallel movements focused on improving quality and patient experience.

As I walk around UCSF Medical Center today, I see an organization transformed by this new focus on improvement. In the patient safety arena, we deeply dissect 2-3 cases per month using a technique called Root Cause Analysis that I first heard about in 1999. The results of these analyses fuel “system changes” – also a foreign concept to clinicians until recently. We document and deliver care via a state-of-the-art computerized system. Our students and residents learn about QI and safety, and most complete a meaningful improvement project during their training. We no longer receive two years’ notice of a Joint Commission accreditation visit; we receive 20 minutes’ notice. While the national evidence of improvement is mixed, our experience at UCSF reassures me: we’ve seen lower infection rates, fewer falls, fewer medication errors, fewer readmissions, better-trained clinicians, and better systems. In short, we have an organization that is much better at getting better than it was a decade ago. Continue reading…

Connecting Medical Devices and Their Makers

Today, an intensive care unit patient room contains anywhere from 50 to 100 pieces of medical equipment made by dozens of manufacturers, and these products rarely, if ever, talk to one another. This means that clinicians must painstakingly review and piece together information from individual devices—for instance, to make a diagnosis of sepsis or to recognize that a patient’s condition is plummeting. Such a system leaves too much room for error and requires clinicians to be heroes, rising above the flawed environment they work in. We need a health care system that partners with patients, their families, and others to eliminate all harms, optimize patient outcomes and experience, and reduce waste. Technology must enable clinicians to achieve those goals, and it could do far more if it started from those goals and worked backwards from there.
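To make the idea concrete, here is a minimal sketch, in Python, of the kind of unified check that becomes possible once bedside devices actually share their data. The field names, device sources, and thresholds are illustrative (loosely modeled on the familiar SIRS screening criteria), not a clinical algorithm or any vendor’s actual interface:

```python
# Minimal sketch: combining readings from separate bedside devices into one
# screening check. Field names, device sources, and thresholds are illustrative
# (loosely modeled on SIRS-style criteria), not a clinical algorithm.

def sirs_like_flags(vitals: dict) -> list:
    """Return which screening criteria a set of combined readings meets."""
    flags = []
    temp = vitals.get("temp_c")          # from the thermometer / monitor
    hr = vitals.get("heart_rate")        # from the ECG monitor
    rr = vitals.get("resp_rate")         # from the ventilator or capnograph
    wbc = vitals.get("wbc_k_per_ul")     # from the lab system

    if temp is not None and (temp > 38.0 or temp < 36.0):
        flags.append("abnormal temperature")
    if hr is not None and hr > 90:
        flags.append("tachycardia")
    if rr is not None and rr > 20:
        flags.append("tachypnea")
    if wbc is not None and (wbc > 12.0 or wbc < 4.0):
        flags.append("abnormal white count")
    return flags


# Today a clinician pieces these numbers together from separate screens;
# an integrated system could evaluate them continuously.
combined = {"temp_c": 38.6, "heart_rate": 104, "resp_rate": 24, "wbc_k_per_ul": 13.1}
flags = sirs_like_flags(combined)
if len(flags) >= 2:
    print("Possible sepsis - review patient:", ", ".join(flags))
```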

This week marks a step that holds tremendous promise for patients and clinicians. On Monday the Masimo Foundation hosted the Patient Safety Science & Technology Summit in Laguna Niguel, California, an inaugural event convening hospital administrators, medical technology companies, patient advocates, and clinicians to identify solutions to some of today’s most pressing patient safety issues. In response to a call from keynote speaker former President Bill Clinton, the leaders of nine major medical device companies pledged to open their systems and share their data.

Lack of interoperability between medical devices plays no small role in the 200,000 American deaths caused by preventable patient harm each year, such as in the case of 11-year-old Leah Coufal. After undergoing elective surgery, Leah received narcotics intended to ease her pain.

When Leah received too much medication, it suppressed her breathing, eventually causing it to stop altogether. Had she been monitored, a device could have alerted clinicians when Leah’s breathing slowed to a dangerous level.

But as we know, clinicians are busy and unfortunately don’t always respond to alarms from bedside machines. If a machine measuring her breathing had been linked with the device delivering her medication, it could have automatically stopped the drugs from infusing into her blue, oxygen-deprived veins.
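The closed loop described above is, at its heart, a simple interlock. Here is a minimal sketch in Python of the idea; the monitor and pump classes are hypothetical stand-ins, since real devices expose nothing this simple:

```python
# Sketch of a closed-loop safety interlock: pause an opioid infusion when the
# respiratory monitor reports dangerously slow breathing. The monitor and pump
# classes are hypothetical stand-ins for real device interfaces.

class RespiratoryMonitor:
    def __init__(self, readings):
        self.readings = readings          # breaths per minute, one per check

    def current_rate(self):
        return self.readings.pop(0) if self.readings else None


class InfusionPump:
    def __init__(self):
        self.running = True

    def pause(self, reason):
        self.running = False
        print(f"Infusion paused: {reason}")


def interlock(monitor, pump, low_rate=8):
    """Pause the pump if the respiratory rate falls below a safe threshold."""
    while pump.running:
        rate = monitor.current_rate()
        if rate is None:
            break
        if rate < low_rate:
            pump.pause(f"respiratory rate {rate}/min below {low_rate}/min")
            # A real system would also page the nurse and escalate the alarm.


interlock(RespiratoryMonitor([14, 11, 7]), InfusionPump())
```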

All of this is possible today; technology is not a barrier. Until now, the only things standing in the way have been a lack of leadership and a lack of willingness among device manufacturers to cooperate.

Continue reading…

The Empowered Patient

When you or a loved one enters a hospital, it is easy to feel powerless. The hospital has its own protocols and procedures. It is a “system” and now you find yourself part of that system.

The people around you want to help, but they are busy—extraordinarily busy. Nurses are multi-tasking. Residents are doing their best to learn on the job. Doctors are trying to supervise residents, care for patients, follow up on lab results, enter notes in patients’ medical records and consult with a dozen other doctors.

Whether you are the patient or a patient advocate trying to help a loved one through the process, you are likely to feel intimidated—and scared.

Hospitals can be dangerous places, in part because doctors and nurses are fallible human beings, but largely because the “systems” in our hospitals just aren’t very efficient. In the vast majority of this nation’s hospitals, a hectic workplace undermines the productivity of nurses and doctors who dearly want to provide coordinated patient-centered care.

At this point, many hospitals understand that they must streamline and redesign how care is delivered and how information is shared so that doctors and nurses can work together as teams. But this will take time. In the meantime, patients and their advocates can help improve patient safety.

Continue reading…

Doctor, I’m Not Comfortable with That Order

A little more than 13 years ago, the Institute of Medicine (IOM) released its seminal report on patient safety, To Err is Human.

You can say that again. We humans sure do err. It seems to be in our very nature. We err individually and in groups — with or without technology. We also do some incredible things together. Like flying jets across continents and building vast networks of communication and learning — and like devising and delivering nothing-short-of-miraculous health care that can embrace the ill and fragile among us, cure them, and send them back to their loved ones. Those same amazing, complex accomplishments, though, are, at their core, human endeavors. As such, they are inherently vulnerable to our errors and mistakes. As we know, in high-stakes fields like aviation and health care, those mistakes can compound into catastrophically horrible results.

The IOM report highlighted how human error in health care adds up to some mind-boggling numbers of injured and dead patients—obviously a monstrous result that nobody intends.

The IOM safety report also didn’t just sound the alarm; it recommended a number of sensible things the nation should do to help manage human error. It included things like urging leaders to foster a national focus on patient safety, develop a public mandatory reporting system for medical errors, encourage complementary voluntary reporting systems, raise performance expectations and standards, and, importantly, promote a culture of safety in the health care workforce.

How are we doing with those sensible recommendations? Apparently to delay is human too.

Continue reading…

Is the Readmissions Penalty Off Base?

I’ve been getting emails about the NY Times piece and my quote that the penalties for readmissions are “crazy”. It’s worth thinking about why the ACA gets hospital penalties on readmissions wrong, what we might do to fix it – and where our priorities should be.

A year ago, on a Saturday morning, I saw Mr. “Johnson,” who was in the hospital with pneumonia. He was still breathing hard but tried to convince me that he was “better” and ready to go home. I looked at his oxygenation level, which was borderline, and suggested he needed another couple of days in the hospital. He looked crestfallen. After a little prodding, he told me why he was anxious to go home: his son, who had been serving in the Army in Afghanistan, was visiting for the weekend. He hadn’t seen his son in a year and probably wouldn’t again for another year. Mr. Johnson wanted to spend the weekend with his kid.

I remember sitting at his bedside, worrying that if we sent him home, there was a good chance he would need to come back.  Despite my worries, I knew I needed to do what was right by him.  I made clear that although he was not ready to go home, I was willing to send him home if we could make a deal.  He would have to call me multiple times over the weekend and be seen by someone on Monday.  Because it was Saturday, it was hard to arrange all the services he needed, but I got him a tank of oxygen to go home with, changed his antibiotics so he could be on an oral regimen (as opposed to IV) and arranged a Monday morning follow-up.  I also gave him my cell number and told him to call me regularly.

Continue reading…

The Good Doctor

Dr. Brian Goldman is right.

We expect a level of perfection from our doctors, nurses, surgeons and care providers that we do not demand of our heroes, our friends, our families or ourselves. We demand this level of perfection because the stakes in medicine are the highest of any field — outcomes of medical decisions hold our very lives in the balance.

It is precisely this inconsistent recognition of the human condition that has created our broken health care system. The all-consuming fear of losing loved ones makes us believe that the fragile human condition does not apply to those with the knowledge to save us. A deep understanding of that same fragility forces us to trust our doctors — to believe that they can fix us when all else in the world has failed us.

I am always surprised when people say someone is a good doctor. To me, that phrase just means that they visited a doctor and were made well. It is uncomfortable and unsettling — even terrifying — to admit that our doctors are merely human — that they, like us, are fallible and prone to bias.

They too must learn empirically, through experience, moving forward to become better at what they do. A well-trained, experienced physician can, by instinct, identify problems that younger ones can’t catch — even with the newest methods and latest technologies. And it is this combination of instinct and expertise that holds the key to providing better care.

We must acknowledge that our health care system is composed of people — it doesn’t just take care of people. Those people — our cardiologists, nurse practitioners, X-ray technicians, and surgeons — work better when they work together.

Working together doesn’t just mean being polite in the halls and handing over scalpels. It means supporting one another, communicating honestly about difficulties, sharing breakthroughs to adopt better practices, and truly dedicating ourselves to a culture of medicine that follows the same advice it dispenses.

Continue reading…

State of the EHR Nation

In a time of EHR naysayers, mean-spirited election year politics, and press misinterpretation (ONC and CMS do not intend to relax patient engagement provisions), it’s important that we all send a unified message about our progress on the national priorities we’ve developed by consensus.

1. Query-based exchange – every country in the world that I’ve advised (Japan, China, New Zealand, Scotland/UK, Norway, Sweden, Canada, and Singapore) has started with push-based exchange, replacing paper and fax machines with standards-based technology and policy. Once “push” is done and builds confidence with stakeholders, “pull” or query-response exchange is the obvious next step. Although there are gaps to be filled, we can and should make progress on this next phase of exchange. The naysayers need to realize that there is a process for advancing interoperability and we’re all working as fast as we can. Query-based exchange will be built on top of the foundation created by Meaningful Use Stage 1 and 2.

2. Billing – although several reports have linked EHRs to billing fraud/abuse and the recent OIG survey seeks to explore the connection between EHR implementation and increased reimbursement, the real issue is that EHRs, when implemented properly, can enhance clinical documentation. The work of the next two years as we prepare for ICD-10 is to embrace emerging natural language processing technologies and structured data entry to create highly reproducible/auditable clinical documentation that supports the billing process. Meaningful Use Stage 1 and 2 have added content and vocabulary standards that will ensure future documentation is much more codified.

3. Safety – some have argued that electronic health records introduce new errors and safety concerns. Although it is true that bad software implemented badly can cause harm, the vast majority of certified EHR technology enhances workflow and reduces error. Meaningful Use Stage 1 and 2 enhance medication accuracy and create a foundation for improved decision support. The HealtheDecisions initiative will bring us guidelines/protocols that add substantial safety to today’s EHRs.
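To give a flavor of the kind of decision support that improves medication safety, here is a minimal sketch in Python of an order-entry allergy check. The allergy list and cross-sensitivity table are toy placeholders, not a real clinical knowledge base or any certified EHR’s actual logic:

```python
# Sketch of an order-entry allergy check. The allergy list and the
# cross-sensitivity table are toy examples, not a clinical knowledge base.

from typing import Optional

CROSS_SENSITIVITY = {
    "penicillin": {"amoxicillin", "ampicillin", "penicillin"},
}

def allergy_alert(ordered_drug: str, patient_allergies: list) -> Optional[str]:
    """Return an alert message if the ordered drug conflicts with an allergy."""
    drug = ordered_drug.lower()
    for allergy in patient_allergies:
        related = CROSS_SENSITIVITY.get(allergy.lower(), {allergy.lower()})
        if drug in related:
            return f"ALERT: {ordered_drug} ordered for patient with {allergy} allergy"
    return None

# Example: the check fires before the order is signed, not after the harm.
print(allergy_alert("Amoxicillin", ["Penicillin"]))
```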
Continue reading…

“Unaccountable”: An Important, Courageous and Deeply Flawed Book

In his new book, Unaccountable: What Hospitals Won’t Tell You and How Transparency Can Revolutionize Health Care, Johns Hopkins surgeon Marty Makary promises a “powerful, no-nonsense, nonpartisan prescription for reforming our broken health care system.” And he partly delivers, with an insider’s and relatively unvarnished view of many of the flaws in modern hospitals. Underlying these problems, he believes, is an utter lack of transparency, the sunshine that could disinfect the stink.

The thesis is important, the honesty is admirable, and the timing seems right. Yet I found the book disappointing, sometimes maddeningly so. My hopes were high, and my letdown was large. If your political leanings are like mine, think Obama and the first debate.

Makary hits the ground running, with the memorable tales of two surgeons he encountered during his training: the charming but utterly incompetent Dr. Westchester (known as HODAD, for “Hands of Death and Destruction”) and the misanthropic “Raptor,” a technical virtuoso who was a horse’s ass. Of course, all the clinicians at their hospital knew which of these doctors they would see if they needed surgery, but none of the patients did. (Of HODAD, Makary writes, “His patients absolutely worshipped him… They had no way of connecting their extended hospitalizations, excessive surgery time, or preventable complications with the bungling, amateurish, borderline malpractice moves we on the staff all witnessed.”)

This is compelling stuff, and through stories like these Makary introduces several themes that echo throughout the book:

1) There are lots of bad apples out there.

2) Patients have no way of knowing who these bad apples are.

3) Clinicians do know, but are too intimidated to speak up.

4) If patients simply had more data, particularly the results of patient safety culture surveys, things would get much better.

Continue reading…

Improving Patient Safety Through Electronic Health Record Simulation

Most tools used in medicine require knowledge and skill on the part of both those who develop them and those who use them. Even tools that are themselves innocuous can lead to patient harm.

For example, while it is difficult to directly harm a patient with a stethoscope, patients can be harmed when improper use of the stethoscope leads to them having tests and/or treatments they do not need (or not having tests and treatments they do need). More directly harmful interventions, such as invasive tests and treatments, can harm patients through their use as well.

Likewise, health information technology (HIT) can harm patients. The direct harm from computer use in the care of patients is minimal, but the indirect harm can potentially be extraordinary. HIT can, for example, store results in an electronic health record (EHR) incompletely or incorrectly. Clinical decision support may lead clinicians astray or distract them with excessive, unnecessary information. Medical imaging may improperly render findings.

Search engines may lead clinicians or patients to incorrect information. The informatics professionals who oversee implementation of HIT may not follow best practices to maximize successful use and minimize negative consequences. All of these harms and more were well-documented in the Institute of Medicine (IOM) report published last year on HIT and patient safety [1].

One aspect of HIT safety was brought to our attention when a critical care physician at our medical center, Dr. Jeffery Gold, noted that clinical trainees were increasingly missing the big picture of a patient’s care because information was “hidden in plain sight,” i.e., spread across a myriad of computer screens and not easily aggregated into a single picture. This is especially problematic where he works, in the intensive care unit (ICU), where the volume of data generated is vast, averaging about 1,300 data points per 24 hours [2]. This led us to perform an experiment in which physicians in training were given a sample ICU case and asked to review it for sign-out to another physician [3]. We found that, of 14 clinical issues, only an average of 41% (range 16-68% for individual issues) were uncovered.
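To illustrate the aggregation problem, here is a minimal sketch in Python of collapsing scattered results into a per-system sign-out summary. The data fields and organ-system groupings are hypothetical, not our actual simulation or any EHR’s display:

```python
# Sketch: collapsing scattered EHR data points into a per-system sign-out
# summary. Field names and organ-system groupings are hypothetical.

from collections import defaultdict

# Each data point as it might arrive from different parts of the record.
data_points = [
    {"system": "respiratory", "item": "PaO2 58 mmHg on 60% FiO2"},
    {"system": "respiratory", "item": "chest x-ray: worsening infiltrate"},
    {"system": "renal",       "item": "creatinine 2.1 (baseline 0.9)"},
    {"system": "infectious",  "item": "blood cultures pending"},
]

def build_signout(points):
    """Group individual results by organ system for a one-screen summary."""
    summary = defaultdict(list)
    for p in points:
        summary[p["system"]].append(p["item"])
    return summary

for system, items in build_signout(data_points).items():
    print(f"{system.upper()}:")
    for item in items:
        print(f"  - {item}")
```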

Continue reading…
