Not really. But every day in America, the same number of people in American hospitals lose their lives because of preventable errors. They don’t die from their disease. They are killed because of hospital-acquired infections, medication errors, procedural errors, or other problems that reflect the poor design of how work is done and care is delivered.
Imagine what we as a society would do if three 727s crashed three days in a row. We would shut down the airports and totally revamp our way of delivering passengers. But the 100,000 people a year killed in hospitals are essentially ignored, and hospitals remain one of the major public health hazards in our country.
There are a lot of reasons for this, but I’d like to suggest that one reason is a terrible burden that is put upon doctors during their training and throughout their careers. They are told that they cannot and should not make mistakes. It is hard to imagine another profession in which people are told they cannot make mistakes. Indeed, in most professions, you are taught to recognize and acknowledge your mistakes and learn from them. The best-run corporations actually make a science of studying their mistakes. They even go further and study what we usually call near-misses (but perhaps should call “near-hits”). Near-misses are very valuable in the learning process because they often indicate underlying systemic problems in how work is done.
If you are trained to be perfect, it is very hard to improve.
David Rosen, an accomplished educator and administrator, and many years ago the Director of Education Services at Jobs for Youth in Boston, watched my TEDx talk and was prompted to say:
Your concern that doctors’ need to be perfect, to make no mistakes, leads me to a (just coined) adage: “perfect is an enemy of good…and also better.”
He goes further and discusses one of his former employees:
When I first worked with Mary she was a perfectionist. It was driving her crazy. As we were creating the JFY competency-based GED curriculum, one day she said “Does everything I write in this curriculum have to be excellent, or are there some things that just need to meet a more basic standard? I don’t think I can do everything perfectly. I need to know from you, as my supervisor, which things need to be excellent and which just need to pass.”
Mary taught me — as a wet-behind-the-ears supervisor — everything I know about good supervision.
Let’s now take this a step further and consider the role of punishment in such an environment. At my former hospital, we had a case in which an orthopaedic surgeon mistakenly operated on the wrong leg of a patient. It was quite clear that the hospital’s “time-out” protocol, which was designed to avoid precisely this kind of error, had not been properly carried out. In the weeks following this disclosure, a number of people asked me if we intended to punish the surgeon in charge of the case, as well as others in the OR who had not adhered to that procedure. Some were surprised by my answer, which was, “No.”
I felt that those involved had been punished enough by the searing experience of the event. They were devastated by their error and by the realization that they had participated in an event that unnecessarily hurt a patient. Further, the surgeon immediately reported it to his chief and to me and took all appropriate actions to disclose and apologize to the patient. He also participated openly and honestly in the case review.
My reaction was supported by one of our trustees, who likewise responded, “God has already taken care of the punishment.” He pointed out that it would be hard to imagine a punishment greater than the self-imposed distress that the surgeon already felt. He had taken a professional oath to do no harm, and here he had, in fact, done harm. But another trustee said that it just didn’t feel right that this highly trained physician, “who should have known better,” would not be punished. “Wouldn’t someone in another field be disciplined for an equivalent error?” he asked.
This was a healthy debate for us to have, but a wise comment by a colleague made me realize that I was over-emphasizing the wrong point (i.e., the doctor’s sense of regret) and not clearly enunciating the full reason for my conclusion. The head of our faculty practice put it better than I had: “If our goal is to reduce the likelihood of this kind of error in the future, the probability of achieving that is much greater if these staff members are not punished than if they are.”
I think he was exactly right, and this was the heart of the logic shared by our chiefs of service during their review of the case. Punishment in this situation was more likely to contribute to a culture of hiding errors rather than admitting them. And it was only by nurturing a culture in which people freely disclose errors that the hospital as a whole could focus on the human and systemic determinants of those errors.
Paul Levy is the former President and CEO of Beth Israel Deaconess Medical Center in Boston. For the past five years he blogged about his experiences in an online journal, Running a Hospital. He now writes as an advocate for patient-centered care, eliminating preventable harm, transparency of clinical outcomes, and front-line driven process improvement at Not Running a Hospital.