
Interoperability: Faster Than We Think – An Interview with Ed Park


Leonard Kish, Principal at VivaPhi, sat down with Ed Park, COO of athenahealth, to discuss how interoperability is defined and how it might be accelerating faster than we think.

LK: Ed, how do you define interoperability?

EP: Interoperability is the ability of different systems to exchange information and then use that information in a way that is helpful to the users. It’s not simply the movement of data; it’s the useful movement of it to achieve some sort of goal that the end user can understand and digest.

LK: So do you have measures of interoperability that you use?

EP: The way we think about interoperability is in three major tiers. The first tier (1) can be defined by the standard HL7 definitions that have been around for the better part of three decades at this point. Those are the standard pipes that are being built all the time: lab interoperability, prescription interoperability, hospital discharge summary interoperability. Those are the sorts of basic messages that are encapsulated in HL7. The second tier (2) of interoperability we think about is the semantic interoperability that has been enabled by meaningful use. The most useful thing that meaningful use did from an interop standpoint was to standardize all the data dictionaries. By that I mean they standardized the medication data dictionary, along with immunizations, allergies and problems.
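
To make that first tier concrete, here is a minimal sketch (in Python) of pulling one result out of a simplified HL7 v2-style lab message; the message content is invented, and real feeds need a proper HL7 parser:

    # Tier 1, "message passing" interop: reading one observation out of a
    # simplified, invented HL7 v2 lab-result (ORU) message.
    hl7_message = "\r".join([
        "MSH|^~\\&|LAB|HAPPYVALLEY|EHR|CLINIC|201511100830||ORU^R01|12345|P|2.3",
        "PID|1||555-44-3333||EVERYWOMAN^EVE",
        "OBX|1|NM|2345-7^GLUCOSE^LN||95|mg/dL|70-99|N",
    ])

    for segment in hl7_message.split("\r"):
        fields = segment.split("|")
        if fields[0] == "OBX":  # an observation/result segment
            test, value, units = fields[3], fields[5], fields[6]
            print(test.split("^")[1], value, units)  # -> GLUCOSE 95 mg/dL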

So prior to meaningful use, if you wanted to exchange medication information between two different systems… there are 28,000 medication formulations in the US… you had to map each of those medications to every other medication in the other system. That turned out to be a completely intractable problem: you’d have teams of informaticists whose job it was to try to map from system to system. The fact that we have now standardized on that medication vocabulary, for example, has been extremely useful. That basically allows for CCDA-driven interoperability for the exchange of longitudinal, discrete data from system to system.
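
To illustrate what that shared vocabulary buys (a sketch; the two “systems”, their drug lists, and the concept IDs are invented in the style of RxNorm codes), reconciliation becomes a key comparison instead of an N-by-N name-mapping project:

    # Medication matching across two systems that both code drugs to a shared
    # vocabulary (RxNorm-style concept IDs; data invented for illustration).
    # Matching is a set intersection, not a 28,000 x 28,000 name-mapping job.
    system_a = {"197361": "Lisinopril 10 MG Oral Tablet",
                "860975": "Metformin 500 MG Oral Tablet"}
    system_b = {"197361": "lisinopril 10mg tab",   # local display names differ...
                "617314": "atorvastatin 20mg tab"}

    shared = system_a.keys() & system_b.keys()     # ...but the codes line up
    print(shared)  # {'197361'}: same drug, no informaticist mapping team needed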

The third tier (3) of interop that we think about is platform-level interoperability, and that’s more akin to what you see in Silicon Valley. It’s where you open up your APIs to different systems out there. It’s not a ‘message passing’ type of interop; it’s a direct, lightweight, web service-based interoperability that actually allows you to graft new functionality directly onto the underlying platform. That’s what you see in the different kinds of apps that are built on Facebook or Google or Amazon. It’s that kind of API/platform-level interoperability that we track.
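
As an entirely hypothetical sketch of that third tier (the base URL, paths, parameters, and auth scheme below are invented for illustration and are not athenahealth’s published API), an app grafting functionality onto a platform through lightweight web services might look like this:

    # Tier 3, platform-level interop: an app calling a cloud EHR's REST API
    # directly instead of exchanging messages. All names are hypothetical.
    import requests

    BASE = "https://api.ehr-platform.example.com/v1"
    HEADERS = {"Authorization": "Bearer <access-token>"}  # placeholder token

    # Read today's appointments, then write back; two-way read/write access
    # is what lets partners graft new functionality onto the platform.
    appts = requests.get(f"{BASE}/appointments",
                         params={"departmentid": 42, "date": "2015-11-10"},
                         headers=HEADERS).json()

    for appt in appts.get("appointments", []):
        requests.post(f"{BASE}/appointments/{appt['id']}/checkin",
                      headers=HEADERS)  # e.g., a kiosk app checking patients in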

LK: So how are apps currently developed by third parties on the athenahealth platform?

EP: So anyone can go online to developer.athena.com and look at the APIs that have been published. In fact Grahame Grieve (creator of FHIR) asked about it. We gave him open access to the APIs that we’ve developed so he could take a look and decide if any of them are useful as he works with Argonaut and SMART on FHIR. There are 36 different folks who have actually built shrink-wrapped integrations on top of our platform, and many more in the pipeline.

LK:  Do you have measures of how much data is flowing through these different levels of interoperability?

EP: Our healthcare transactions team tracks a multitude of different things that are flowing across athenaNet: provider to provider, provider to hospital, information flowing from athenaNet to different health registries. It also counts the number of API calls we have on the network.

We had 365 million standard transactions go across athenaNet in the last month. I think that by itself is significant, because we are building out what we call a “utility backbone”. The advantage to us over anyone else who’s building this up is that once we build a single global tunnel, all of our clients can subscribe to it. So we actually have an advantage because we don’t have to build tons of point-to-point services.

We build it once to our cloud-based network and everyone on athenaNet can then connect to Happy Valley Lab. So we have 365 million standard transactions flowing through those pipes, which is good, because otherwise they’d be faxes! But the ones that we track more closely are the move to the platform, and this last month we had about 48 million API calls from platform partners. Today we have a total of 65 partners who have access to those APIs and to over a thousand different ‘API endpoints’, different ways of accessing the data. So that’s about 48 million calls. And that for us is exciting, because it represents a new and different way of thinking about interoperability.
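
The arithmetic behind that “build once” utility-backbone advantage is worth spelling out: connecting n systems pairwise takes n(n-1)/2 interfaces, while a shared hub takes only n, one per subscriber. A quick sketch:

    # Interface counts: point-to-point wiring vs. a shared "utility backbone".
    def point_to_point(n: int) -> int:
        return n * (n - 1) // 2   # every pair of systems needs its own interface

    def hub_and_spoke(n: int) -> int:
        return n                  # each system connects once, to the hub

    for n in (10, 100, 1000):
        print(n, point_to_point(n), hub_and_spoke(n))
    # 10 -> 45 vs 10; 100 -> 4,950 vs 100; 1,000 -> 499,500 vs 1,000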

LK: Has that grown significantly?

EP: It’s grown from near zero a few years ago.

LK: Many (on-premises) vendors can’t collect this kind of data?

EP: Yeah, they would likely have no way of knowing.

LK: Tell me a little more about virtual networks. How do you see interoperability becoming a competitive advantage in the next few years?

EP: The business of healthcare is changing very quickly, and technology has to be an enabler and keep up with the business, as opposed to technology being a driver. The idea of Kaiser Southern California putting together a partnership with Target and saying “Look, we want our Kaiser doctors to staff the Target clinics, but we want to retain the longitudinal care for which Kaiser is famous” then runs into a technical obstacle. So at that point you get Epic and athenahealth together and say “What can we do to preserve longitudinal access to data?” We worked together to figure out how, and we’re doing it.

Same thing with some of the community docs in the Yale/New Haven area. Yale/New Haven runs Epic, and a bunch of the community docs are on athenahealth, but they want to preserve the idea of a longitudinal view of the care of the patient so they can track the patient from place to place. So we put together an integration with Epic and Yale/New Haven.

The strategic battlefront of healthcare is moving out of the hospital and toward retail clinics, community docs and urgent care chains. You can see that in all of the numbers that are being put out there. The pendulum is actually swinging out there because it turns out the number one thing that patients care about is access to care. For two out of three patients, the thing they care most about is access. As that happens, I think the need for interoperability, the need to be able to track a patient from the edges, the casual outside care, back into the tertiary care centers, is very important.

Similarly, I think that as you begin thinking about the chronic patients out there, the 20% of patients who comprise 80% of the cost, those patients also require some kind of virtual touchpoint, because they will be accessing nodes of care that are not part of the traditional system. You add all of that up, and what we are seeing is that those systems that make interoperability a priority have a strategic advantage over those that don’t, because patients as consumers are accessing care in ways that are new and different and not owned by the health system anymore. Interoperability is becoming a strategic imperative for sharing patient information back to the system as new modalities emerge beyond the traditional health system.

Clinical data sharing is going to be necessary for patients to get closer to seamless care. For them to perceive seamless care you actually have to be able to move the information from place to place, and it has to feel like a partnership. It’s no good if they have to re-register and tell their health history again. So I think the competitive pressures are going to require that.

LK: Who is going to drive that? Is that going to be the health system, retail clinics, independent doctors or independent practices?

EP: That’s a question I’ve been asking a number of folks. I spend about half my time out in the field, and I think the short answer is that everyone I talk to, from solo docs to the urgent and retail chains to the large secondary health systems to the large tertiary partner systems, believes they have an urgent need to drive the change themselves. Because everyone sees that that’s where the next battleground is, and so everyone believes that it’s their responsibility. Which in some ways is good news, because I think it means it’s going to happen.

LK: You’ve mentioned before that you see it as somewhat the vendor’s role to push clients toward interoperability. You can see it a little bit in your webinar, because you’re outlining the strategic road map and saying “this is where you need to go.” What other kinds of things are you doing to push your clients toward more interoperability?

EP: I think it’s part of our job to get the ball rolling downhill. We want to get to the point where our clients take interoperability for granted. Which means they should take for granted that they can exchange data with their referring or specialist practices, and that they can exchange information seamlessly with labs and hospitals. We want our clients to be able to innovate out there (telehealth or some chronic care management app, for example) and to plug it into the platform without having to wonder whether the new device or app will work.

Our perspective is that we want to push our clients to take these things for granted, because there’s been a two-decade history of “unless it’s all built by the same vendor, it’s not going to work.” We’re trying to get them to a new normal. But that new normal, candidly, is no different than the normal we all experience when we go to a bank and expect our card to work with any ATM around the world, or when we use a new Apple product and expect it to work seamlessly with Microsoft and Google. We expect that stuff to work. I expect that my iPhone will work well with our Microsoft Exchange server or with Gmail. That’s an expectation that folks in other industries have by default, that we don’t have by default in healthcare, and that’s the thing we want to change.

LK: Is value-based payment really what’s driving all of this in this new direction, or is it more than that?

EP: I think value-based payment began to warm the tundra for this, but I think the need for patient access is at this point eclipsing it and driving things faster. A lot of this did come from value-based payment: you did need the standardized immunizations, allergies, problems and medications in order to be able to capture all the data necessary to be successful in an ACO. But I think in the last nine months the move to patient access has actually been driving this harder. And I think the simple reason for that is economics. There’s not enough at stake financially for ACOs, especially in these first few days of one-sided ACOs, to really radically change their behavior. On the other hand, I’ve seen health systems whose EDs had three urgent care centers open up within one mile, and the ED business dropped by 20% overnight. When that happens, I think you have the attention of all the incumbents.

It’s one of the reasons that I think we’re going to see a massive acceleration in telehealth in the next year or two. It’s not that it’s all of a sudden a good new idea. It’s because there are real market forces in play today that are forcing a look toward new and innovative ways of creating patient access. I think those in turn will force both new kinds of interoperability between different players and new kinds of innovation. All of the different innovations in chronic care that have been put on the shelf over the last few years are coming off now and being put into practical use as health systems seek to create better attachment to their patients.

LK:  What do you see as the major roadblocks to interoperability?

EP: Two things, I think.

One is that we have to continue to push forward the idea that it is politically and legislatively unacceptable not to be interoperable. I think it’s important that we keep pushing the message that interoperability is inevitable and that we all have to work aggressively toward that goal. Legislation and proposed legislation must make it clear that it’s not OK to pretend that you can live as an island unto yourself.

Point two is that the next advances in interoperability from a technical perspective will not and cannot be legislated. One of the analogies for interoperability, which I find to be a useful starting point, is the thread to a light socket. So everyone got together and decided that there’s one way to build a light socket and everyone conforms to that. Or that there’s one thread to a fire hydrant. That’s a famous example. “If everyone just standardizes the way that hoses connect to fire hydrants then we could have saved that city.” That’s a standard way of looking at interop, and I think it leads to a perspective that if we just define the standards well enough, then everything will be fine.

But if you look at the history of Silicon Valley over the last ten years, the Netflix API, the Google API, the Amazon API, the Apple API are unrecognizable from where they started. And those APIs were not legislated; they were in fact built to appeal to a mass market of developers. So whoever built the best API ended up attracting the most developers and ended up building more value onto their platform. Insofar as their business depended on the success of their platform, they ended up spending a lot of time trying to evolve their API in a market-oriented fashion to ensure that developers would keep developing on them. I think that is, in fact, what is happening in healthcare today. So the legislation and the standards were a good first cut at poking a hole in the dike, but in order for the waters to rush through, my own personal perspective is that we’re going to have to let market forces take their course in building truly usable APIs that allow for truly plug-and-play interoperability. Those two pillars are the way that I see interoperability moving forward.

LK:  Can we get to a patient centered record and what would it look like?

EP: The quote that keeps coming to mind is William Gibson’s: “The future is already here, it’s just unevenly distributed.” I think that applies to healthcare. One, a patient-centered record is inevitable; and two, look at mint.com in the financial services industry. Every financial services portal that you work with today already has some sort of way to interact electronically, but it’s not a comprehensive view. So what Mint ended up doing was saying “Look, let’s find a way to aggregate all your information from across the financial institutions you work with in a way that’s actually personalized for you and helps you take better care of your financial health.” One path would be for health care to follow a similar track.

As different institutions begin to make their patient data available on different portals, those portals will end up being machine-accessible, and at that point you’ll have enough infrastructure to create a single patient-centered record that takes into account your health information across all the different modalities of care you visit, including your gym and other things that aren’t traditionally thought of as health care.
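
A sketch of what such an aggregation layer might look like once portals are machine-accessible through standard endpoints (the FHIR base URLs and patient IDs below are placeholders; a real aggregator would also need authorization and patient matching):

    # Mint-style health aggregator: pull one person's records from several
    # portals' FHIR endpoints and merge them into a single view.
    import requests

    sources = [  # (FHIR base URL, that system's ID for the patient): placeholders
        ("https://hospital-a.example.org/fhir", "patient-123"),
        ("https://clinic-b.example.org/fhir", "pt-9"),
    ]

    record = []
    for base, pid in sources:
        bundle = requests.get(f"{base}/Observation",
                              params={"patient": pid}).json()
        record.extend(entry["resource"] for entry in bundle.get("entry", []))

    print(f"{len(record)} observations aggregated from {len(sources)} portals")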

Part of the reason we’ve seen the likes of Google Health fail in early incarnations is simply that the information resident in the EHRs was not, in fact, mechanically accessible at any price. And with the explosion of patient portals, the next logical step is “So now I have my information stuck in ten different portals; now what do I do?” That’s when you’re going to begin seeing consolidation. The step from information on servers under desks all the way to patient-centered records with all the data was just too long a leap. But I think what we’re seeing is a series of inevitable incremental steps that will get us exactly there. So I think it’s inevitable, I think it’s a great thing to do, and ensuring that we keep on that track is an important part of our mission. It’s all happening, and it’s happening now.

9 replies

  1. But patients are still seen as product, not customers, by all parties in the healthcare industry landscape. So we’re stuck in a loop …

  2. Yes, Dr. Palmer. I’m not even referring to all of the possible med device interfaces, just the RDBMS data dictionaries that vary from one vendor to another.

    We’re going to re-define “interoperability,” for one thing (take out the “without special effort” part; “special effort” will become the new workflow norm, and it won’t be seen as “special”). Yeah, and “omics” data, jeez. I just finished reading two FDA tech papers on proposed standards for genomic assay and data quality and clinical diagnostic expertise reliability.

    http://www.fda.gov/downloads/MedicalDevices/NewsEvents/WorkshopsConferences/UCM468521.pdf
    http://www.fda.gov/downloads/MedicalDevices/NewsEvents/WorkshopsConferences/UCM467421.pdf

    Better late than never, I suppose.

  3. Yes, Leonard, but you don’t control 365 million transactions per month. APIs are a ping-pong between the vendors saying they have to do what their customers say and their customers saying the standards aren’t ready. Once either party decides to take responsibility for patient-controlled APIs, the ping-pong stops, because the other has to go along.

    athenaNet and Epic Everywhere are _both_ in a position to implement a patient-directed API. As national vendors, they do not have to wait for any particular client to approve. They can have a relationship directly with a patient or a physician. They could be the first ones to implement the HEART patient-directed profile.

  4. The problem is, Bobby, that we have all these machines that we have to hook up to the EMR, and each of these has its own API. E.g., how do we put the record of a cardiac echo on the EMR when it may have hundreds of still shots and dozens of short videos that, in aggregate, take 20 minutes to display? And just reporting the summary really isn’t enough to transfer the info properly. Just think of all the computerized tools an ophthalmologist uses; hooking all these up to the EMR would be a nightmare. They tell me this is the reason they are so hesitant to join the parade. Of course, putting the result of a GWAS (genome-wide association study) on the record would be almost silly.

  5. Personally, I love this idea. As I’ve said before, patient access to such data, I hope, will be considered a civil right. The individual still seems like the only reasonable place (as in, the one with the clear economic benefit to share) to act as the hub for sharing this data. This is a critical question about the future of care and the future of personal freedom and choice.

  6. It’s good to have a public discussion about healthcare APIs, THCB is a good place to have it, and Ed + Leonard are obviously well-qualified to anchor such a discussion. That said, the current post provides little actionable insight.

    Fortunately, we have a particular action that we can debate around APIs for patient-directed health information exchange. The issue at hand in the four major API workgroups (FHIR, Argonaut, SMART on FHIR, and HEART) is:

    Who controls access to the API?

    Right now, as the post makes clear, the APIs are controlled by those who buy technology. But the reality is that we have $1 Trillion of waste in healthcare https://hbr.org/2015/10/how-the-u-s-can-reduce-waste-in-health-care-spending-by-1-trillion and it’s unreasonable to expect the beneficiaries of that excess $Trillion to pay for interoperable systems that will introduce the transparency and substitutability that might reduce their revenue or market share.

    The issue of “who controls the API” is front and center in the HEART workgroup, where we’ve tackled the issue of who decides that an EHR or an app can connect to the API. If the patient is allowed to decide, then the API operator is effectively given a “safe harbor” under the HIPAA patient right of access. This also eliminates a host of thorny privacy issues such as patient matching and consent management. This strategy, based on the User Managed Access standard, does not require the patient to manage a PHR. As we showed in the Privacy on FHIR pilot, patient-directed exchange can connect EHRs directly the way HIEs try to do. Halamka’s post http://geekdoctor.blogspot.com/2015/11/the-path-forward-for-meaningful-use.html has a good link to the current HEART profile proposal https://docs.google.com/document/d/1RchwNe8ddxnEzQihtivdUl9y-qfryjevyUEtGcnQ-eA/ for patient-directed exchange.

    The API discussion is now at a clear fork. Either the patient can take the responsibility (with appropriate “Black Box” warnings but no host restrictions) for access to the API, or control over who and what can connect to the API continues to rest with those who have the patient’s data along with an excess $Trillion of our money.

  7. Do you think the EMR and the EHR should be open to researchers? To health departments? To the CDC? To the patients? To other payers like health districts and counties and CMS?
    What if you simply allowed all these legitimate users to access your data with telnet or SSH? Maybe there are ways forward that bypass the need to arrive at common dictionaries and compatible APIs?

  8. “The most useful thing that meaningful use did from an interop standpoint was to standardize all the data dictionaries.”
    __

    That is simply not true. Standard nomenclatures/vocabularies are not the same as “standard data dictionaries,” which come at the EHR architectural RDBMS metadata level. Data dictionaries continue to differ from one vendor to another. Because ONC never bothered to study the extent of the differences — by, say, requiring the submission of the database dictionaries as a condition of the MU certification application — we simply still don’t know the magnitude of the variability.

    The other thing Mr. Park leaves out is the IEEE interop definition’s clause “without special effort on the part of the customer.” By “defining interoperability down,” we really could simply declare victory and go home, given that virtually all mainstream EHRs have report-writing functionality that can burp out PDF and XML documents to send as secure attachments — i.e., “data exchange,” materially differing little from faxes.
    __

    “…the next advances in interoperability from a technical perspective will not and cannot be legislated. One of the analogies for interoperability, which I find to be a useful starting point, is the thread to a light socket. So everyone got together and decided that there’s one way to build a light socket and everyone conforms to that. Or that there’s one thread to a fire hydrant. That’s a famous example. “If everyone just standardizes the way that hoses connect to fire hydrants then we could have saved that city.” That’s a standard way of looking at interop…”

    “Will not be legislated.” Yeah, I buy that. But not the “cannot be” assertion; that’s a choice we’ve made. And, just to be clear, I’m not arguing that the feds would have to derive and publish a data dictionary standard themselves. But, WHERE is the Consensus Standards-Bodies “convening” leadership here? I see a lot of endless talk and slick 4-color 10 Year Plan report-writing, but little else.

    We missed the boat on that window of opportunity, I suppose. And, to be sure, there are myriad functional “industry consensus standards” out there across the breadth of industries and technologies. And, to riff on your “socket” analogy, if ONC promulgated an “interop standard” for household electricity, today you’d likely go to Lowe’s to choose from more than 2,000 sizes and shapes of “Stage 2 Certified 120 VAC 15 amp” wall sockets.

    Maybe APIs will be the HIT interop panacea. Maybe. I certainly hope so. But citing social media and other online consumer-facing interfaces obscures the reality that the typical incumbent ONC certified ambulatory EHR houses about 4,000 dictionary-defined variables within the schema, not a dozen or two.

    http://regionalextensioncenter.blogspot.com/2015/10/interoperability-we-dont-need-no.html

  9. This gives me a lotta hope – I’ve long thought that closed systems were going to wind up collapsing in on themselves. That’s true in any market sector, but healthcare has had a bucket on its collective head when it came to data liquidity (aka interop), which is why it is still partying like it’s 1995, with fax machines still the prevalent “tech” in too many places.

    athenahealth seems perfectly positioned, given the work they’re doing on the API front, to become the utility backbone (to use their phrase) across the whole system, particularly in light of their ability to create pipes into/out of Epic, a la the Yale/New Haven project Ed mentioned in your interview.

    Reading this redeems Ed (a little bit) after his “patients aren’t asking for their data” comment on the main stage at Health Datapalooza earlier this year. Which, if you were in the room, you know drew some pretty loud “BOO!” noises, and not just from the Consumer Circle …

    If athenahealth wants to hit hyperdrive, they might want to create an epatient working group to help them get an anvil chorus of patients demanding data liquidity in care settings across the US. Rather than trying to talk the industry itself into interop, which is a long, slow slog of prying fax machines out of industry’s grasp. Just an idea …