Back in the
‘stone ages’ when I (an MIT grad) was an intern, I was called at 4 AM to see
someone else’s gravely ill patient because her IV had infiltrated. I
started a new one and drew some blood work to check on her status. When
the results came back (on paper) I (manually) calculated her anion gap.
This is simple arithmetic but I had been up all night and didn’t do it right.
On rounds the attending assured me that there was nothing I could have done anyway
but, of course, in other circumstances it could have made a difference and an
EHR could have easily done this calculation and brought the problematic result
to my attention. My passion for EHRs and FHIR apps to improve them really
traces back to this patient episode I will never forget.
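The check I fumbled that night is exactly the kind of rule an EHR can automate. A minimal sketch of what that automation looks like, using the standard formula AG = Na − (Cl + HCO3); the 12 mEq/L alert threshold here is illustrative, as labs vary in their reference ranges:

```python
def anion_gap(na: float, cl: float, hco3: float) -> float:
    """Serum anion gap in mEq/L: sodium minus (chloride + bicarbonate)."""
    return na - (cl + hco3)

def flag_anion_gap(na: float, cl: float, hco3: float, high: float = 12.0):
    """Return the gap and whether it exceeds the (illustrative) alert threshold."""
    gap = anion_gap(na, cl, hco3)
    # The EHR, not the exhausted intern, does the arithmetic and raises the flag.
    return gap, gap > high

gap, elevated = flag_anion_gap(na=140, cl=100, hco3=15)
print(f"anion gap = {gap} mEq/L, elevated = {elevated}")
```

One comparison and one subtraction: trivial for software, and exactly the sort of step a human gets wrong at 4 AM.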
My criticism of the recent Kaiser Health News and Fortune article, “Death by 1,000 Clicks,” is generally not about what it says but about what it doesn’t say, and about its tone.
The article emphasizes the undeniable fact that EHRs introduce
new sources of medical error that can harm patients. It devotes a lot of ink
to documenting some of these in dramatic terms. Yes, with hundreds of vendors
out there, the quality of EHR software is highly variable. Among the major
weaknesses of some EHRs are awkward user interfaces that can lead to errors. In
fact, one of the highlights of my health informatics course is a demonstration
of this by a physician whose patient died at least in part as a result of a
poor EHR presentation of lab test results.
However, the article fails to pay equal attention to the
ways EHRs can, if properly used, help prevent errors. It briefly mentions that
a roughly 60% majority of physicians using EHRs feel that they improve quality. The
reasons quality is improved deserve more attention. The article also fails to
discuss some of the new, exciting technologies to improve EHR usability through
innovative third-party apps and the real progress being made in data sharing,
including patient access to their digital records.
Americans on average will visit a care provider about 300 times over the course of their lives. That’s hundreds of blood pressure readings, numerous diagnoses, and hundreds of entries into a patient’s medical record—and that’s potentially with dozens of different doctors. So it’s understandable, inevitable even, that patients would struggle to keep every provider up-to-date on their medical history.
This issue is compounded by much of our healthcare information being fragmented among multiple, incompatible health systems’ electronic health records. The majority of these systems store and exchange health information in unique, often proprietary ways—and thus don’t effectively talk with one another.
Fortunately, recent news from Apple points to a reprieve for patients struggling to keep all of their providers up-to-date. Apple has teamed with roughly a dozen hospitals across the country, including the likes of Geisinger Health, Johns Hopkins Medicine, and Cedars-Sinai Medical Center, to make patients’ medical histories available to them on their phones. Patients can bring their phones with them to participating health systems and provide caregivers with an up-to-date medical history.
Empowering patients with the ability to carry their health records on their phones is great, and will surely help them overcome the issue of fragmented healthcare records. Yet the real feat is the underlying standardization of how healthcare data is exchanged, which made this possible. In fact, this standardization may pave the way for innovation and rapid expansion of the health information technology (HIT) industry.
Paul Black is CEO of Allscripts and he’ll be with me at Health 2.0 on October 1-4. Paul has been CEO of Allscripts for about five years, taking over from Glen Tullman who grew the company aggressively by acquisition over the previous decade. Paul has been steering Allscripts through a pretty big transformation for the past few years, and they’ve been the major EMR vendor that has most aggressively reached out to the startup tech community. This is an edited transcript of an interview we had in late August. — Matthew Holt
Matthew Holt: Paul, thanks for talking with me today, but also we’re going to have you on for a quick chat when you’ll be on the main stage at Health 2.0, of which Allscripts has been a great supporter. Your colleagues Tina Joros and Erik Kins have been there for many years but not you, so I’m thrilled to have you coming in early October. Paul, welcome!
Paul Black: Thank you very much. It’s a pleasure to be on the call with you today.
Matthew: Let’s dive into the current state of play. There have been some changes over the last five to seven years since the HITECH dollars came in, as more and more physicians, and more and more hospitals, put in electronic medical records.
Obviously, Allscripts was, I think it’s right to say, built by Glen and Lee Shapiro via a lot of acquisitions, especially the Eclipsys purchase, with the goal of becoming a big player in that meaningful use world. And obviously, you have your old company, Cerner, and your friends from Wisconsin, Epic, who have been very dominant in becoming a single platform for many large integrated delivery systems. Can you give me your sense of where the mainstream enterprise EMR market is at the moment?
Paul: I think that the mainstream EMR market in the United States is becoming a mature market. And by that I mean it’s a marketplace in which almost every institution, almost every hospital, almost every post-acute facility, almost every ambulatory facility, has some semblance of an electronic medical record system. And certainly, they have an electronic billing set of capabilities. So, from that standpoint, almost everybody has something with regard to the ordering of, the management of, and the documentation surrounding a clinical series of events.
Matthew: Give me a sense of how you think that’s changing in terms of the split between the integrated systems which are covering in-patients and out-patients, with physicians using the same system on both sides of the fence, and the continued, I would say growth, but probably more accurately the continued existence, of a large ambulatory-only segment of the market? After all, that’s different not only for the way that health systems and medical groups organize, but also for the way that they’re served by organizations like you and Epic and many others. Is that system integration continuing or do you think that trend is kind of stopping?
Paul: I’ll take it from a couple of different angles. One is integration at the industry level: what has been vertical integration of large integrated delivery networks, or large multispecialty groups or specialty practices, or in some cases payers who are acquiring assets. I tend to see that while there was a lot going on over the course of the last four or five years, I’m starting to see people be more focused on what they’ve already acquired, looking at operational efficiency and looking internally to ensure that they’re gleaning the expected returns, both clinically and financially, of the original goals of how they built those enterprises. That means from a culture standpoint, from an operations standpoint, and from a financial standpoint.
So, I don’t sense that there is as much of a, if you will, go-go attitude toward the continuation of acquisitions. I don’t think it’s necessarily been a conscious pause, but in some cases there have been a lot of affiliations and acquisitions that have caused people to make sure they’ve done the things they need to do to really operationalize and optimize the assets and the people that are now part of a new overall enterprise. I think from an industry standpoint, of the people that serve that marketplace, us and some of the companies that you mentioned today, I see it as just a natural progression of the other point that you started with about where we find ourselves in the state of the industry.
“We did not spend $35 Billion to create 5 data silos.” This was said by Vice President Biden at the beginning of Datapalooza on Monday and repeated by CMS’s Andy Slavitt on Tuesday. On Wednesday, at the Privacy and Security Datapalooza at HHS, I proposed a very simple definition of electronic health record (EHR) interoperability as the ability for patients and physicians to access independent decision support at the point of care regardless of what EHR system was being used.
Over the three days of Datapalooza, I talked to both advocates and officials about data blocking. In my opinion, current work on FHIR and HEART is not going to make a big dent in data blocking and would not enable independent decision support at the point of care. The reasons are:
Grahame Grieve is a long-time leader within HL7 and one of the key drivers behind FHIR. He chats with Leonard Kish about what’s been happening and what’s ahead for interoperability.
LK: First tell me how you got into standards… it’s kind of an odd business to get into. Why have you chosen this and why are you excited about it?

G: It happened by accident. I was working for a vendor and we were tasked with getting some exchanges and I wanted them to be right the first time. That was the philosophy of the vendor: if we did it right the first time, then we wouldn’t have to keep revisiting it, and that meant using the standards correctly. The more I got involved, the more I discovered that it wasn’t obvious how to do that… and that the standards themselves weren’t good. I felt personally that we need really good standards in healthcare. So it became a personal mission and I got more involved through the company I was working for, and eventually I left so I could continue doing what I wanted to do with the standards. I enjoy the community aspect of standards and feel very strongly that it’s worth investing time in, and I had the opportunity to build a business out of it, which not many people do. So now I freelance in standards development and standards implementation.

LK: There’s a lot of talk in Congress about the lack of interoperability and everyone probably has their own definition. Do you have a working definition of interoperability, or is there a good definition you like?

G: The IEEE definition, to get data from one place to another and use it correctly, is pretty widely used. I guess when you’re living and breathing interoperability you’re kind of beyond asking about definitions.

LK: Are there ways to measure it then? Some people talk about different levels: data interoperability, functional interoperability, semantic interoperability. Are there different levels and are there different ways to measure interoperability?

G: We don’t really have enough metrics. It’s actually relatively easy to move data around.
What you’ve got to do is consider the costs of moving it, the fragility of the solution, and whether the solution meets the user’s needs around appropriateness, availability, security, and consent. Given the complexity of healthcare and business policy, it’s pretty hard to get a handle on those things. One thing that is key is that interoperability of data is neither here nor there in the end, because if providers continue with their current work practices, the availability of data is basically irrelevant: they treat themselves as islands. They don’t know how to depend on each other. So I think the big open area is clinical interoperability.

LK: Interoperability in other verticals mostly works. We hear talk about Silicon Valley and open APIs. There’s perhaps less commotion about standards, maybe because there are fewer conflicting business interests than in healthcare. Why is healthcare different?

G: First of all, from an international perspective, I don’t think other countries (where incentives are different) are by and large better off or different. They all have the same issues, and even though they don’t have the business competition or the funding insanity that you do in the US, they still have the same fundamental problems. So I hear a lot of stuff from the US media about that and I think it’s overblown. The problem is more around micro-level transactions and the motivations for them, and fundamentally the same problem of getting people to provide integrated clinical care when the system works against them doing that.

LK: So can you give me an example of how things are maybe the same with the NHS or another country vs. the US in terms of people not wanting to exchange clinical data?

G: In Australia, there’s a properly funded medical health care system where the system is overwhelmed by the volume of work to be provided. No one gets any business benefit from not sharing content with other people.
Still, because you have to invest time up front to exchange data and other people get the benefits later, there are very low participation rates for any kind of voluntary data-sharing scheme that you set up. There are scandalously low adoption rates. And that’s not because it’s not a good business idea to get involved; it’s because the incentives are misaligned at the individual level (and the costs are up front).

LK: Right, so maybe it’s also a lack of consumer drive? It’s their data and you’d expect the incentives to align behind them, but they don’t ask and don’t get, maybe because we (or our providers) only access our records when we really need them. It’s not like banking or email or other things we use on a daily basis?

G: Probably that’s part of it, but from a consumer’s point of view, what does it do for them, getting access to their data?
Every quarter, Health 2.0 releases a summary set of data that explains where industry funding is going, which product segments are growing fastest, and where new company formation is happening. Health 2.0’s precision and clarity when it comes to market segmentation and product information make this quarterly release the cream of the freebie crop.
The major news this quarter is that funding has slowed compared to this time last year, notwithstanding a significant bump from Allscripts’ $200M investment in NantHealth on the last day of the month. Yet, we’re still seeing growth in the Health 2.0 Source Database — both in number of products and companies. We also highlight the release of the Apple Watch, the growing momentum around FHIR, some key moves in the data analytics space, and the success of the latest Health 2.0 IPOs. For more, flip through below.
Currently, when healthcare data moves in this country, it does so via fax machines and patient sneaker-nets. Automated digital interoperability is still in its earliest stages; mostly, it has a history of being actively resisted by both EHR vendors and large healthcare providers. We, as an industry, should be doing better, and our failure to do so is felt every day by patients across the country.
Both of these plans ignore the lessons in execution from ONC’s previous strategic plan for health IT. The current Interoperability Roadmap mentions the NwHIN (Nationwide Health Information Network), for instance, but covers only what it accomplished, which was mostly policy successes like the DURSA (Data Use and Reciprocal Support Agreement). NwHIN was supposed to be a network of networks that connected every provider in the country. Why hasn’t that happened?
ONC has forgotten what the actual ambition was in 2010. It was not to create cool policy documents. The plan five years ago was to have the “interoperability problem” solved in five years. The plan five years before that was probably to solve the problem in five years. Apparently, our policy makers look at interoperability and say, “wow, this is a big problem, we need at least five years to solve it,” without any ironic awareness that this is what they have been saying for decades, even before Kolodner led ONC.
There has been much enthusiasm in the health IT industry regarding the health data standard that HL7 International is working on, HL7 FHIR, which is now a DSTU (draft standard for trial use). Everyone involved with health data – EHR vendors, interoperability vendors, medical app developers, “big data” proponents, and hospital CIOs, to name a few – has high hopes that FHIR can be the golden ticket that leads to true health care interoperability.
Most of the enthusiasm is around the technologies being utilized in the standard including RESTful web services, JSON encoding, and granular data content called resources.
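Part of what makes resources approachable is that each one is just a small JSON document returned by a plain HTTP GET. A minimal sketch of handling such a response (the field values below are invented for illustration, though resourceType, name, and birthDate are real FHIR Patient elements):

```python
import json

# A pared-down FHIR "Patient" resource, as it might come back from a
# request like GET {base}/Patient/123 with Accept: application/fhir+json.
# (The values here are invented for illustration.)
patient_json = """
{
  "resourceType": "Patient",
  "id": "123",
  "name": [{"family": "Example", "given": ["Pat"]}],
  "birthDate": "1970-01-01"
}
"""

patient = json.loads(patient_json)
# Granularity is the point: this document describes one patient, not a
# whole clinical summary, so a mobile app can fetch exactly what it needs.
name = patient["name"][0]
print(f'{" ".join(name["given"])} {name["family"]}, born {patient["birthDate"]}')
```

No SOAP envelope, no WSDL: any platform with an HTTP client and a JSON parser can consume this.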
Technology-Empowered FHIR Data
REST, in particular, is a technology that has been strongly embraced by other industries and has the potential to be leveraged for engaging patients by connecting mobile technologies with their providers’ EHR systems. This advancement represents a huge step toward building a patient-centered health care system.
Over the last decade, the healthcare industry has utilized SOAP-based web services to transfer documents. Most programmers today, if given their choice, would likely lean towards RESTful web services, preferably with data encoded in the JSON format. It is a better choice for mobile applications independent of whether the client device technology is iOS, Android, Windows, or even Mobile Web. Most social media sites today, such as Twitter and Facebook, publish RESTful APIs for connectivity.
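The preference is easy to see side by side. A minimal illustration (both requests are invented, not drawn from any real API): the SOAP style wraps even a simple lookup in an XML envelope POSTed to a single endpoint, while the REST style addresses the resource directly by URL and lets the HTTP verb carry the intent.

```python
# SOAP style: every operation is an XML document POSTed to one endpoint.
soap_request = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetPatient><id>123</id></GetPatient>
  </soap:Body>
</soap:Envelope>"""

# REST style: the resource has its own URL; the verb (GET) carries the
# intent, and the response body would be plain JSON, not an XML envelope.
rest_request = "GET /Patient/123 HTTP/1.1\r\nAccept: application/json"

print(rest_request.splitlines()[0])  # the entire substance of the REST request
```

For a mobile client, the REST version means less parsing code, smaller payloads, and no SOAP toolkit dependency on the device.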
This preference towards RESTful web services is based on some of the advantages that REST has over SOAP: