Now that the Obama administration and Congress have committed to spending billions of taxpayers’ money on health IT as part of the economic stimulus package, it’s important to be clear about what consumers and patients ought to expect in return—better decision-making by doctors and patients.
The thing is, nobody can make good decisions without good data. Unfortunately, too many in our industry use data “lock-in” as a tactic to keep their customers captive. Policy makers’ myopic focus on standards and certification does little but provide good air cover for this status quo. Our fundamental first step has to be to ensure data liquidity – making it easy for the data to move around and do some good for us all.
We suggest the following three goals ought to be achieved by the end of 2009:
- Patients’ clinical data (diagnoses, medications, allergies, lab results, immunization history, etc.) are available to doctors in 75% of emergency rooms, clinic offices, and hospitals within their region.
- Patients’ doctors or medical practices have a “face sheet” that lets any staff member see an all-up view of the patient’s relevant health data, including visit status, meds, labs, and images, all of which are also viewable by patients via the Web.
- Every time patients see providers, they are given an electronic after-visit report that includes what was done and what the next steps for care will be according to best practices and evidence-based protocols, whenever these are applicable.
Some who view this seemingly humble list of achievements will say that we can’t do it, because the standards aren’t ready, or the data is too complex. They’ll say that delays are necessary, due to worries about privacy or because too much data is still on paper.
We disagree. We believe that where there’s a will, there is going to be a way. And we already know most of what we need to know to achieve these goals. We know that:
- Huge amounts of clinical data already exist in electronic form, but scattered across many proprietary systems (meds, labs, images).
- Software and the Internet make it possible—in a low-cost, lightweight way—to get data out of these databases to the point of decision-making (to the ER doctor, the patient/consumer, or the primary care physician).
- People are hungry for information in whatever form they can get it:
- Getting it on paper is better than nothing
- Getting it quickly is better than getting it late
- Getting it in a non-standard digital format is better than paper (software is pretty good at transforming non-standard formats into standard ones; see the sketch after this list)
- Getting it in a standard format is better
- Getting it in a structured, standard format is best
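To illustrate the parenthetical above (that software is good at converting formats), here is a minimal sketch in Python. The CSV layout, column names, and mapping are all invented for illustration; a real vendor feed would need its own mapping, but the principle holds: once software can get at the data, converting it is routine work.

```python
import csv
import io
import json

# A hypothetical non-standard export: every lab vendor names columns
# differently, but the information itself is all there.
RAW_CSV = """PatName,TestCode,TestDesc,Val,Units,Drawn
Doe-John,GLU,Glucose,105,mg/dL,2009-01-15
Doe-John,K,Potassium,4.1,mmol/L,2009-01-15
"""

# Per-vendor mapping from this vendor's column names to a common structure.
COLUMN_MAP = {
    "PatName": "patient",
    "TestCode": "code",
    "TestDesc": "name",
    "Val": "value",
    "Units": "units",
    "Drawn": "collected_on",
}

def to_structured(raw_csv):
    """Translate one vendor's CSV layout into common structured records."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    return [{COLUMN_MAP[col]: val for col, val in row.items()} for row in reader]

print(json.dumps(to_structured(RAW_CSV), indent=2))
```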
An integration “big bang” — getting everybody all of a sudden onto one, single, structured and standard format — can’t and won’t happen.
We don’t have to wait for new standards to make data accessible — we can do a ton now without standards. What we need more than anything else is for people to demand that their personal health data are separated from the software applications that are used to collect and store the data.
This idea of separating health data from the applications is very important, and a better way to frame the discussion about how to achieve data liquidity than the term “interoperability,” which we find cumbersome and opaque. Smart people, armed with software, can do incredible things with data in any format – so long as they can get to it.
Customers of health information systems want to re-use their health data, and in ways they haven’t always thought of or anticipated. However, many enterprise system vendors make it difficult or expensive to get access to the data — to separate it from the application. They believe that proprietary “lock-in” allows them some form of strategic advantage.
We understand that IT vendors are in business, and need to create strategic value for their products. And we are very much in favor of that — in rules, in workflow, in user experience, price and flexibility, and so on. However, vendors should not be able to “lock” the patient or enterprise data into their applications, and thereby inhibit the ability of customers and partners to build cross-vendor systems that improve care.
It’s possible for vendors to provide value without the need for lock-in. There are plenty of examples, among them the Health Information Exchange in Wisconsin and CVS MinuteClinic. In the former, value is clearly being added immediately for users in the ED, without requiring all the participating EDs to change their systems or to be standards-compliant (or CCHIT-certified). At MinuteClinic, summary after-visit health data are made available to customers online using the Continuity of Care Record standard. This is where the low-hanging fruit is.
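To make the CCR reference concrete, below is a hedged sketch, using Python’s standard library, of assembling a simplified, CCR-style after-visit summary. The element names are loosely modeled on the ASTM Continuity of Care Record and simplified for illustration; they are not guaranteed to validate against the real schema, which has its own namespace and required sections.

```python
import xml.etree.ElementTree as ET

def build_ccr_summary(patient_name, medication, next_step):
    """Assemble a simplified, CCR-style after-visit summary as XML.

    Structure is loosely modeled on the ASTM CCR; a real document
    would use the official schema, namespace, and required sections.
    """
    root = ET.Element("ContinuityOfCareRecord")
    patient = ET.SubElement(root, "Patient")
    ET.SubElement(patient, "Name").text = patient_name
    body = ET.SubElement(root, "Body")
    meds = ET.SubElement(body, "Medications")
    ET.SubElement(meds, "Medication").text = medication
    plan = ET.SubElement(body, "Plan")
    ET.SubElement(plan, "Recommendation").text = next_step
    return ET.tostring(root, encoding="unicode")

print(build_ccr_summary("John Doe", "Lisinopril 10 mg daily",
                        "Recheck blood pressure in two weeks"))
```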
There’s already a proven model for extracting and transforming data in many ways – HL7 feeds, non-HL7 feeds, web services, database replication, XML and XSLT, and more – and along the way we can create value by interpreting the data and adding metadata. Microsoft is doing it today – both in the enterprise with Amalga and across enterprises to the consumer with HealthVault. We hope other vendors follow this lead to drive better outcomes for patients.
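As one small, hedged example of what an HL7 feed transformation can look like: the Python sketch below pulls the observation (OBX) segments out of a pipe-delimited HL7 v2 lab result message and turns them into structured records. The sample message is invented, and the field positions follow common v2 conventions; a production system would use a proper HL7 library and handle site-specific variations.

```python
# A tiny, invented HL7 v2 ORU (lab result) message; segments are
# newline-separated here for readability (real feeds often use \r).
SAMPLE_ORU = "\n".join([
    "MSH|^~\\&|LAB|HOSP|EHR|CLINIC|200901151230||ORU^R01|MSG0001|P|2.3",
    "PID|1||12345||DOE^JOHN",
    "OBX|1|NM|2345-7^Glucose||105|mg/dL|70-99|H",
    "OBX|2|NM|2823-3^Potassium||4.1|mmol/L|3.5-5.1|N",
])

def parse_obx_segments(message):
    """Extract structured observations from the OBX segments of a v2 message."""
    observations = []
    for segment in message.replace("\r", "\n").splitlines():
        fields = segment.split("|")
        if fields[0] != "OBX":
            continue
        code, _, name = fields[3].partition("^")  # OBX-3: observation identifier
        observations.append({
            "code": code,            # e.g. a LOINC code
            "name": name,
            "value": fields[5],      # OBX-5: observation value
            "units": fields[6],      # OBX-6: units
        })
    return observations

for obs in parse_obx_segments(SAMPLE_ORU):
    print(obs)
```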
Unlike the physical world, where there is a need for de jure standards—think railroad tracks—in the software world there is much more flexibility, and the standards that work are the ones that evolve from USAGE and market acceptance. The certification-and-standards road equals conferences, press releases, “connectathons,” and caregivers-turned-bureaucrats. The outcomes road equals immediate benefits to actual caregivers AND learning we can apply to the next round, and the next, and the next.
We have given the industry decades to make this happen — and only in the last year or two have people finally gotten fed up and just started moving. Our great risk here is that the people lobbying for dollars and certification today are the people who are invested in the old road. With the amount of money we are talking about, we run the risk of simply giving them another decade to delay and plan. Instead, let’s put the dollars into rewarding behavior and outcomes, and let the people who live with the problems every day figure out how to solve them.
When we set out to go to the moon in the 1960s, we didn’t say “let’s build a great rocket.” So, too, in this case we shouldn’t say “let’s buy a great IT system.” Our measurements should be tied to what we want – better care, informed by the data that is just out there waiting for us to use it.
David C. Kibbe MD MBA is a Family Physician and Senior Adviser to the American Academy of Family Physicians who consults on health care professional and consumer technologies. Peter Neupert is Health Solutions Group Corporate Vice President at Microsoft.
Excellent post.
Why not let doctors and patients determine if systems are useful, affordable, worth it?
Instead of mandating certification, the government could very simply publish standards, much like what has been done with HL7, etc.
Certification does not equal interoperability.
“On the other hand are consumers clamoring for PHRs?”
This is likely to change over time. Most consumers/patients are unaware of the existence of PHRs and of their potential value to them regarding saved time (=money), lower charges paid, fewer errors, better quality and coordination of care, and so on.
“Until now hospitals have had no business case to share the information, and in fact disincentives, since duplicated tests are a huge cash cow.”
This is true. Those receiving income from the “wasted” (aka non-value-adding) activities in medical services of course oppose change that eliminates that income.
The resistance from physicians themselves to digitization of clinical data is a mixture of factors: unwillingness to invest in learning and training required, fear of computer technology, tribalistic attitude towards tools that are imposed from outside the “tribe”, etc.
The financial return to medical service deliverers is positive, substantially so in some cases, for well-implemented systems, as are the returns to “customer service” and to improvement in quality of service.
Interoperability is critical and CCHIT certification is the way to go forward. But here is the rub: there is a need for a unique patient identifier that gets around privacy regulations.
The issue is to link meaningful data in a way that supports analysis. My understanding is that the privacy regulations (although important) present a barrier to an achievable goal.
In October 2008, RAND put out a report that details the need for a unique patient identifier. This is a critical legislative priority that needs to be taken seriously by Congress.
http://www.rand.org/pubs/monographs/MG753/
Let’s recap. For the last 8 years the country had an administration that encouraged the private sector to solve problems, and we failed to implement HIT. Now some of the biggest advocates of that model are complaining about the lack of progress?
No one has prevented the private sector from solving the problem of interoperability in the last decade. Vendors like Epic have a vested interest in not allowing systems to share data, as well as technical challenges (yes, we know how complicated it is to share once you customize the app), but it is silly to have the same vendor at 3 hospitals in the same city that can’t share data.
People make rational business choices, and the problem is that the vendors’ “clients” are hospital systems, not the payers. Until now hospitals have had no business case to share the information, and in fact disincentives, since duplicated tests are a huge cash cow. People do exactly what we pay them to do, and the private sector failed to solve the problem.
On the other hand are consumers clamoring for PHRs? Nope. In those systems that have them, they use them to make appointments and get their labs, but recent studies at Group Health in Colorado show that it doesn’t reduce the overall number of visits. Do you have one, and how has it changed your care?
I would be curious to see the conversation between Brian and John Tooker (Chair of the Board of the NeHC) the next time they are at a meeting with one another, or between Peter and Rick Ratcliff (Surescripts and CCHIT) at the next HIMSS conference.
Analysis exactly on target.
“An integration “big bang” — getting everybody all of a sudden onto one, single, structured and standard format — can’t and won’t happen.”
True. Let’s hope the powers-that-be in the Obama Administration realize this.
“There’s already a proven model for extracting and transforming data in many ways – HL7 feeds, non-HL7 feeds, web services, database replication, XML and XSLT, and more – and along the way we can create value by interpreting the data and adding metadata”
All the elements already exist to exchange data through the techniques listed here, primarily the use of the HL7 protocol and web services, an area where Microsoft, among many other software developers, has been active.
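For instance, here is a minimal, hedged sketch in Python (using Flask) of the web-services piece: a read-only endpoint serving one patient’s medication list as JSON. The route, identifiers, and data are all invented for illustration; a real exchange would of course add authentication, consent checks, and audit logging.

```python
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for a real clinical data store; all contents invented.
MEDICATIONS = {
    "12345": [
        {"name": "Lisinopril", "dose": "10 mg", "frequency": "daily"},
    ],
}

@app.route("/patients/<patient_id>/medications")
def get_medications(patient_id):
    """Return one patient's medication list as JSON, or 404 if unknown."""
    if patient_id not in MEDICATIONS:
        abort(404)
    return jsonify(MEDICATIONS[patient_id])

if __name__ == "__main__":
    app.run(port=5000)
```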
David, et al.
This is such smart thinking. Clear, realistic, and simple (relative to forcing complexity to find value) structured standards can be employed NOW to achieve the goal of data liquidity. I think we tend to overthink and overcomplicate basic needs, and a certification such as CCHIT is a nonessential stick that has no proven impact in moving us forward on the ‘outcomes road’. David, your electronic after-visit report (a carrot, not a stick) is exactly what can help extend the clinician-patient encounter and relationship, and potentially adherence and outcomes – all of which, of course, need to be measured. I think a W3C-like consortium would be extraordinarily valuable to promote structured, standardized formatting. Great idea, once again.
Bravo… let’s do it… Time to empower and educate physician champions and leaders across specialties, nationwide, to push implementation of ‘basic’ standards as above, by year end. There is a huge barrier to break through, though: consider the 1,000-physician group within my practice area that will be months to years behind in implementing a ready-to-go EHR/PM solution, partly due to MD pushback (i.e., change is ‘difficult’ and ‘scary’) and partly due to overly lofty goals of requiring a near-complete solution upfront. The recommendations for stepwise information sharing should be implemented first, and can then easily be built out as needed for each practice/system. Think of the efficiency and cost savings alone with such ‘simple’ first steps. Let me know how I can help!
Margalit: Thank you for your kind comments. Actually, I’m a lot more sanguine about the possibility of a W3C-like consortium forming sometime in 2009 than I’ve ever been. It’s a good idea, and might occur organically if a preponderance of IT vendors are willing to take responsibility to cooperate and collaborate, instead of trying to dominate the market with proprietary code. The “new road” forward will take innovation, but most of all we have to agree that change would be a good thing for all.
Regards, DCK
Fantastic!! This is by far the sanest, most compelling article I have read on this blog (could be because I totally agree :-)).
The interoperability, certification, enterprise-software bandwagon seems to be unstoppable. I didn’t think the small voices advocating caution could be heard over the loud CCHIT cheers. As I have written here and elsewhere before, this entire CCHIT certification process is a clear and present danger to innovation. In the current interoperability festivities, the Emperor is really, really naked.
If we must “certify” something, let us certify, or define and maintain standards for data exchange and terminology.
We should create a consortium like W3C that maintains the standards and maybe even provides services like validation, documentation and so much more to vendors and individuals that are attempting to use those standards in their software. If it worked for Web standards on a global scale, it should work for our tiny industry even better. Maybe we can fund that as part of this HIT bonanza.
And you are right, software developers will take it from there… that is, if they are allowed to.