No One Is Perfect, Not Even Computers

My last post described how a precisely regimented dosage of intravenous medication, delivered to me over six hours by a state-of-the-art computer, actually depended on the existence (and the six-hour survival) of a handwritten yellow Stickie hanging on my IV pole. I write this post as the recipient of a "care error" caused by a computer — certainly not a victim, since no harm occurred.

After my first infusion I grumbled to my physician that it had taken six hours, and that the package insert the nurse gave me recommended about a two-hour infusion for someone of my weight and age. He was surprised but responded, "Those nurses are really good. They probably have more information about the drug. I would go with what they say." So I called the Head Nurse in the Infusion Center. She told me that the infusion rates come from the computer. "How does the computer know them?", I asked. She responded, "The Hospital Pharmacy Committee puts them in." I called the Chief Pharmacist, noted the difference between the package insert and the computer's recommendations, and asked him to review the information, because I would sure like to spend just two hours off my boat rather than six for the next treatment. He contacted me a couple of days later to tell me that the medication's infusion rate had been entered into the computer several years ago and was based on data from the one manufacturer of the medication at that time. "There are now three manufacturers and two different concentrations. Each one has different infusion rates. Yours could go in over 2 hours. I will take care of updating the computer's recommendations for your medication before the next treatment."

Continue reading…

Leapfrogging CPOE

Last week, yet another alarming Computerized Physician Order Entry (CPOE) study made headlines. According to Healthcare IT News, The Leapfrog Group, a staunch advocate of CPOE, is now "sounding the alarm on untested CPOE" as its new study "points to jeopardy to patients when using health IT". Up until now we had inconclusive studies pointing to increased, and also decreased, mortality in one hospital or another following CPOE implementation, but never an alarm from a non-profit group that made it its business to improve quality in hospitals by encouraging CPOE adoption — and this time the study involved 214 hospitals using a special CPOE evaluation tool over a period of a year and a half.

According to the Leapfrog brief, 52% of medication errors and 32.8% of potentially fatal errors in adult hospitals did not receive appropriate warnings (42.1% and 33.9%, respectively, for pediatrics). A similar study published in the April edition of Health Affairs (subscription required), using the same Leapfrog CPOE evaluation tool but only 62 hospitals, provides some more insight into the results. The hospitals in this study are using systems from seven commercial vendors and one home-grown system (not identified), and, most interestingly, the CPOE vendor had very little to do with a system's ability to provide appropriate warnings. For basic adverse events, such as drug-to-drug or drug-to-allergy interactions, an average of 61% of events across all systems generated appropriate warnings. For more complex events, such as drug-to-diagnosis or dosing, appropriate alerts were generated less than 25% of the time. The results varied significantly among hospitals, including hospitals using the same product. To understand the implications of these studies, we must first understand the Leapfrog CPOE evaluation tool, or "flight simulator" as it is sometimes called.

Continue reading…