For the majority of my career I have been obsessed with creating technologies to modernize our largely dysfunctional U.S. healthcare system. To me, it is very clear that the emergence of cloud computing has finally created the opportunity to truly address this daunting problem. Cloud-based solutions are the only viable option for effectively getting providers, patients and other key stakeholders online so that the necessary efficiencies find their way into the system.
To the rest of healthcare IT, however, it is not so clear, as evidenced by the scarcity of truly cloud-based companies in the marketplace.
Most of the large, established players in this industry continue to rely on outdated client/server architectures or even older technologies, such as MUMPS. Some of these companies’ products trace their roots as far back as 1969. These companies and their software were built before the world wide web, before Facebook, the iPhone and iPad, salesforce.com – and even email, for God’s sake! There is also a tremendous amount of confusion created by the morass of small, bootstrapped EMR companies, which number in the hundreds. People do not understand the difference between buying a monolithic, single-purpose app and adopting a robust, cloud-based platform approach.
This lack of understanding has made me realize that we need a better way to explain what the cloud has the power to do, and what true cloud-based technology even is. Easier said than done!
I was recently afforded a breakthrough, though unfortunately at the expense of an ancient treasure. Allow me to explain:
Why “half the cost”? How? Most importantly, what does it mean for hospitals and health systems? Here’s the argument, and some of the implications.
In 1980, health care in the United States took no more of a bite out of the economy than it did in any other developed country. Then we instituted cost controls. By 2000, U.S. health care cost twice as much as everyone else’s. By 2020 or 2025, we may be back to costing the same as any other country — half the current cost as a share of GDP.
Historical charts of the comparative cost of health care in different countries show a startling and obvious pattern. The trend lines of the leading economies form a fairly tight pack, drifting slowly upward from around 5 percent of GDP in 1960 to 8 percent to 10 percent in recent years — except for one. Around 1980, the U.S. trend line sharply breaks from the pack, and quickly establishes itself at half again as much as most other leading economies, then twice as much.
This happened over the very period that Medicare, followed by private health plans, instituted increasingly stringent and widespread unit cost controls.
I draw two conclusions from this: The notion that U.S. health care must cost twice as much as everyone else’s is not exactly the law of gravity. And there is no evidence that unit cost controls actually control system costs. In fact, through a series of complex feedback mechanisms, it may well be that controlling unit costs pushes up system costs, as members of the system find ways to increase their prices and the volume and acuity of their utilization despite the caps on reimbursements for individual items.
Working in the health care space has forced me to give up many hopes and expectations that I had a few years ago. Forgive me for being cynical (it’s an easy feeling to have following the country’s largest health IT conference, as I reported a month ago), though some positive trends do step in to shore up hope. I’ll go over the redeeming factors after listing the five tough lessons.
1. The health care field will not adopt a Silicon Valley mentality
Wild, willful, ego-driven experimentation–a zeal for throwing money after intriguing ideas with minimal business plans–has seemed to work for the computer field, and much of the world is trying to adopt a “California optimism.” A lot of venture capitalists and technology fans deem this attitude the way to redeem health care from its morass of expensive solutions that don’t lead to cures. But it won’t happen, at least not the way they paint it.
Health care is one of the most regulated fields in public life, and we want it that way. From the moment we walk into a health facility, we expect the staff to be following rigorous policies to avoid infections. (They don’t, but we expect them to.) And not just anybody can hang out a shingle and call themselves a doctor. In the nineteenth century it was easier, but we don’t consider that a golden age of medicine.
Instead, doctors go through some of the longest and most demanding training that exists in the world today. And even after they’re licensed, they have to complete continuing education regularly to keep practicing. Other fields in medicine are similar. The whole industry is constrained by endless requirements that make sure the insiders remain in their seats and no “disruptive technologies” raise surprises. Just ask a legal expert about the complex mesh of federal and state regulations that a health care provider has to navigate to protect patient privacy–and you do want your medical records to be private, don’t you?–before you rave about the Silicon Valley mentality. Also read the O’Reilly book by Fred Trotter and David Uhlman about the health care system as it really is.