You can say that again. We humans sure do err. It seems to be in our very nature. We err individually and in groups — with or without technology. We also do some incredible things together. Like flying jets across continents and building vast networks of communication and learning — and like devising and delivering nothing-short-of-miraculous health care that can embrace the ill and fragile among us, cure them, and send them back to their loved ones. Those same amazing, complex accomplishments, though, are at their core human endeavors. As such, they are inherently vulnerable to our errors and mistakes. As we know, in high-stakes fields like aviation and health care, those mistakes can compound into catastrophic results.
The IOM report highlighted how the known human error in health care adds up to mind-boggling numbers of injured and dead patients — obviously a monstrous result that nobody intends.
The IOM safety report didn't just sound the alarm; it recommended a number of sensible steps the nation should take to help manage human error. These included urging leaders to foster a national focus on patient safety, develop a public mandatory reporting system for medical errors, encourage complementary voluntary reporting systems, raise performance expectations and standards, and, importantly, promote a culture of safety in the health care workforce.
How are we doing with those sensible recommendations? Apparently, to delay is human too.