ONC Signals a Shift From Documents to Interfaces

All of you Meaningful Use and Health IT junkies should read Data for Individual Health. Although long, it's definitely worth a scan by everyone who cares about health tech. This is the third JASON-related report in a year out of ONC, and it comes a month or so before the planned release of the first details of ONC's announced 10-year plan. I think there's a reason for that timing, and much of the report is introduced by ONC's earlier post.

There are three key points I would highlight:

First, and most important, this report suggests that HIPAA Covered Entities (mostly hospitals, doctors, and their EHRs) are no longer the center. The future, labeled the Learning Health System, makes mobile and patient-centered technology an equally important part of the architecture and talks about interoperability with them rather than "health information exchange" among HIPAA CEs and their Meaningful Use mandates.

Second, this JASON report, unlike the previous two, does not talk about Meaningful Use any more. That money is spent. A lot of orgs are lobbying against any more MU mandates and, although I’m pretty sure there will be a Stage 3, it could be toothless or very much delayed.

Third, Direct, the original Blue Button, Blue Button Plus Push, and CCDA files are pretty much history.  Although the JASONs don’t say it as plainly as I am, document-based interoperability has failed and we’re moving on to Application Programming Interfaces (APIs) that don’t use CCDA or any of the stuff mandated by MU 1 and 2. Blue Button Plus Pull and FHIR, both with a modern industry-standard OAuth security scheme, are the future for all sorts of good reasons which you need to read the JASON reports with some care to understand. It’s all there.
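
To make that contrast concrete, here is a minimal, hypothetical sketch (Python, using the requests library) of what the API style looks like: a single FHIR resource fetched over HTTPS with an OAuth 2.0 bearer token rather than a CCDA document pushed to an inbox. The server URL, resource id, and token are placeholders, not any particular vendor's endpoint.

```python
# Hypothetical sketch only: placeholder FHIR server, resource id, and token.
import requests

FHIR_BASE = "https://ehr.example.org/fhir"   # assumed FHIR endpoint
ACCESS_TOKEN = "<oauth2-bearer-token>"       # obtained via an OAuth 2.0 flow

resp = requests.get(
    f"{FHIR_BASE}/Patient/12345",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/fhir+json",
    },
    timeout=10,
)
resp.raise_for_status()    # failures are reported in-band, immediately
patient = resp.json()      # a discrete, structured FHIR resource
print(patient.get("name"))
```

This OAuth-protected, HTTPS-and-JSON pattern is what the pairing of Blue Button Plus Pull and FHIR with an industry-standard OAuth scheme refers to.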

I’m personally not sure Direct will survive the end of the MU incentives and the high cost and address-space fragmentation introduced by DirectTrust trust bundles and other failing governance efforts. The future is likely to be founded on web industry standards rather than health vendor standards and to be much more respectful of both privacy and cybersecurity as a result. Blue Button may well survive as the portal standard for manual download, but the user experience for PHRs will be vastly superior with FHIR, and PHRs that survive the transition to APIs are likely to deprecate Direct.

This shift to FHIR will be most challenging for Health Information Exchanges. When every hospital, doctor, and mobile app exposes a secure API using strong, general purpose credentials, what do we need HIEs for?

This shift from documents to APIs will take 12-24 months. It will be interesting to see whether ONC uses MU3 to reinforce the shift and keep MU relevant a bit longer, or simply lets it rest in peace.

Either way, ONC should be congratulated for moving as quickly as it did with the JASON process.

46 replies

  3. Adrian,

    You and I have had our disagreements…

    …but I have to say that what you’ve written here is brilliant in its simplicity, straightforwardness, and insight.

  4. I tend to agree with Scott Mace http://www.healthleadersmedia.com/page-1/TEC-311081/Argonaut-Project-is-a-Sprint-Toward-EHR-Interoperability The standards can be ready for trials and pilots in 2015. The business and political realities, however, are the dominant factors.

    The government can, if it chooses, implement FHIR and related secure access standards aggressively among the systems it controls, including Medicare, the VA, DoD, and even the federal health insurance exchanges. The government can also fund open source reference implementations of various components of the Learning Health System and make them available to everyone. This can all be done starting today, and there’s no obvious reason it could not be fully operational throughout government-controlled systems within two years.

    That would be a nice way to set the pace, don’t you think?

  5. There are three separate threads of skepticism above: hype, standards, and what the feds can do.

    Those who consider the Public API to be hype are missing the essential difference between documents and interfaces. Messaging interfaces, including old HL7, CCDA documents, and Direct email, are all very difficult to integrate because errors cannot be managed in real time. As the Web has shown with HTTP(S), when you separate out the security layers, get your errors on the spot, and allow real-time client processing and redirection, the cost of integration plummets.
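
    A hedged sketch of that difference, assuming a hypothetical FHIR endpoint (Python, requests): because the transaction is synchronous HTTP, the client sees the status code and can refresh a token, follow a redirect, or surface the error on the spot, which a store-and-forward document exchange cannot do.

    ```python
    # Illustrative only; the endpoint and token are placeholders.
    import requests

    try:
        resp = requests.get(
            "https://ehr.example.org/fhir/Observation/6789",
            headers={"Authorization": "Bearer <token>",
                     "Accept": "application/fhir+json"},
            timeout=10,
            allow_redirects=True,      # real-time redirection handled here
        )
        if resp.status_code == 401:
            print("token expired: refresh and retry now, not days later")
        elif resp.status_code == 404:
            print("no such resource: tell the user immediately")
        else:
            resp.raise_for_status()
            observation = resp.json()  # structured data, ready for processing
    except requests.RequestException as exc:
        # With a mailed CCDA or a Direct message, this failure might only be
        # discovered after manual follow-up, if at all.
        print(f"request failed: {exc}")
    ```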

    Standards cannot be imposed by the feds or anyone else. Standards do evolve faster when commercial interests are aligned. For hardware-bound standards like WiFi and Bluetooth, the software and protocol pieces come together very quickly. When the standards and protocols are core to the lock-in business model of the EHR vendors and their large hospital network customers, interoperability standards are an existential threat. The only solution in these cases is open source software for everything the standard touches. It’s not that all EHRs have to be open source (that will happen in time as doctors and patients come to insist on non-secret, peer-reviewed technology); it’s just that enough EHRs have to be open source to establish the standards in a part of the industry where vendor and patient lock-in is not part of the business model. The VA has led the way in this space. Let’s hope DoD joins the parade. Dr. Chen, above, and his open source EHR are the future.

    Third, the most constructive thing government can do is to fund open source reference implementations of everything related to the standards. That cannot change the lock-in business realities, but it can directly drive toward a de facto standard. This was well demonstrated 20 years ago with the introduction of DICOM into radiology imaging.

  6. Great post, Adrian. I wonder what RWJF and AHRQ will fund next, though. My initial question was whether the 12-24 month time frame is achievable. I’d love to hear your thoughts on that time frame, Adrian.

  7. Good.

    “You’re not sending data to a singular entity that owns the keys to an API that they created.”

    Well, yeah, I understand that. That’s what “public API” means. But there are approximately 4,000 variables in a typical ambulatory EHR, each with its own data dictionary and RDBMS schema. How many variables will require cross-mapping for useful, clinically functional interop? How much time will this take you, total (including QA testing and verification, the latter per 45 CFR 164.312(c)(1) and (2), “authentication”)? What is the cost of that time?

    Then, every time HL7 comes out with an updated FHIR API standard release, will (at least some of) your interface code have to be updated as well? It’s good that you have programming chops. You are in a very small minority of docs in that regard. Those skills will be required of someone in every instance, every install. FHIR is not download/plug & play out of the box. If this type of thing becomes a routine EHR vendor responsibility, so much the better. That’s where the responsibility should reside anyway.
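
    One small, hypothetical illustration of that moving-target problem (Python, requests): a client can at least detect which FHIR version a server declares through its /metadata endpoint before deciding whether its existing mapping code still applies. The server URL and the supported-version list here are assumptions.

    ```python
    # Sketch only: placeholder server; reads the FHIR version the server declares.
    import requests

    resp = requests.get(
        "https://ehr.example.org/fhir/metadata",   # FHIR capability endpoint
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    declared = resp.json().get("fhirVersion")
    SUPPORTED = {"1.0.2", "3.0.2", "4.0.1"}        # versions our mapping code handles
    if declared not in SUPPORTED:
        print(f"server reports FHIR {declared}; interface code may need updating")
    ```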

    Just don’t expect it to be free.

  8. As I’m writing this, I’m already coding my PHP-based, web-based EHR to interact using documented FHIR APIs. There is no charge, since all the documentation is right there, as a public API should be. You’re not sending data to a singular entity that owns the keys to an API that they created. It’s an API language that all health care IT innovators who really care about interoperability need to learn and adopt. Yes, it’s still being developed, but it’s a very doable framework, and I started creating RESTful interfaces for my EHR in less than a week. It’ll be posted on my GitHub site soon.
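
    For a feel of how small the server-side surface of a documented public API can be, here is a minimal sketch of a single FHIR-style read endpoint. It is not the commenter's PHP code; it uses Python and Flask with invented data, purely for illustration.

    ```python
    # Hypothetical sketch: one FHIR-style RESTful read endpoint (Python/Flask),
    # standing in for the commenter's PHP implementation.
    from flask import Flask, jsonify, abort

    app = Flask(__name__)

    # Stand-in for the EHR's own database lookup.
    PATIENTS = {"12345": {"resourceType": "Patient", "id": "12345",
                          "name": [{"family": "Example", "given": ["Pat"]}]}}

    @app.route("/fhir/Patient/<patient_id>", methods=["GET"])
    def read_patient(patient_id):
        patient = PATIENTS.get(patient_id)
        if patient is None:
            abort(404)                 # a FHIR "read" returns 404 when absent
        return jsonify(patient), 200, {"Content-Type": "application/fhir+json"}

    if __name__ == "__main__":
        app.run(port=8080)
    ```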

  9. “Sounds like we are at the beginning of the Gartner hype cycle all over again.”
    __

    See my comment link from above.

    https://thehealthcareblog.com/blog/2014/12/07/onc-signals-a-shift-from-documents-to-interfaces/comment-page-1/#comment-702872

    The HL7 FHIR API still has to have data export/import mapping code written by SOMEONE out of and into each of the sending and receiving EHRs (a fact complicated by our refusal to require an EHR Standard Data Dictionary). If it’s just “out of the sending EHR,” then at the destination “inbox” you have an imported XML-type document that has to reside somewhere separately and cannot be integrated into the recipient’s EHR RDBMS absent manual transcription (whether keystroked, screen-scraped, or Dragon’d where possible) or custom in-house coding. Absent database assimilation, it’s essentially a fax by any other name, not materially different from dumping a report module or progress note out to a PDF, encrypting it, and sending it as a secure email attachment.
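
    As a concrete, hypothetical illustration of that mapping burden (Python, with an invented local table and field names): every inbound FHIR element still has to be translated into the receiving system's own schema, field by field, by someone.

    ```python
    # Sketch only: the table and column names are invented, not any vendor's schema.
    import sqlite3

    def import_fhir_patient(conn: sqlite3.Connection, resource: dict) -> None:
        """Map an inbound FHIR Patient resource into a hypothetical local table."""
        name = (resource.get("name") or [{}])[0]
        conn.execute(
            "INSERT OR REPLACE INTO patients "
            "(external_id, family_name, given_name, birth_date) VALUES (?, ?, ?, ?)",
            (resource.get("id"), name.get("family"),
             " ".join(name.get("given", [])), resource.get("birthDate")),
        )
        conn.commit()

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE patients (external_id TEXT PRIMARY KEY, "
                 "family_name TEXT, given_name TEXT, birth_date TEXT)")
    import_fhir_patient(conn, {"resourceType": "Patient", "id": "12345",
                               "name": [{"family": "Example", "given": ["Pat"]}],
                               "birthDate": "1970-01-01"})
    ```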

    Are we to assume that ONC will at some point REQUIRE that EHR vendors write bi-directional FHIR data map code into their systems as a condition of CHPL certification? (As is not the case today with data-export CCDs and/or CCRs.) That the “not-for-profit” HL7® will benevolently “license” these FHIR® APIs free of charge (and work with each vendor to write the maps), in the charitable interest of the long-sought national “interoperability standard”?

  10. Very good questions and extremely valid concerns. If a public API is adopted and utilized, it may put all closed-source, proprietary systems on notice that they cannot rely on market share alone to survive. They must innovate or die.

    This is where open source solutions, such as mine, will be able to get a good foothold in these once closed-off corners of the market, especially since MU legislation further entrenched these legacy technologies and hampered true interoperability. A community-based, peer-reviewed forum such as an open source EHR project that harnesses the power of this public API to address safety, efficacy, and usability in a transparent and accountable way is ultimately the way to go. And a public API without the extreme certification costs (either it works or it doesn’t) that would drown a non-profit, open source project will further support an innovative ethos and environment, something that has been stifled in Health IT for too long.

  11. I agree with you, Hayward, that improvements in health IT alone will not fix the spiraling health care costs in the US and that the answer truly lies in prevention. Those of us on the front lines of outpatient primary care (myself included) have been embattled and decimated by both the perverse incentives of the FFS payment model and the MU legislation, which together have created a digital divide between those who can afford costly, unusable, and unwieldy EHRs and those who have sat out on the sidelines with the old paper/pen/fax model, biting the bullet by either opting out of Medicare altogether or moving to a Direct Practice model.

    What I’m excited about in the developments Adrian has posted, however, is the opportunity for a public API (FHIR by far has the most traction, and it’s clearly documented), built on existing industry standards (REST, OAuth), to become the cornerstone for the secure movement of health data from patient to provider and vice versa, with the patient ultimately having control over their data.

    The implications for primary care physicians may not be obvious, but this is what I see. Open source projects could easily be developed that take advantage of RESTful architecture, spurring innovation at minimal cost to the provider user. No longer will the provider be at the mercy of legacy, closed-source, proprietary technologies, which makes the leap across the digital divide much more palatable. In fact, several providers who have balked at jumping to an open source project (such as mine) may be more willing to do so now that there is no incentive to stick with a costly, unwieldy EHR when the relevant health care data is no longer siloed in those EHRs to begin with.

    When Public APIs become available and adopted en masse, the decentralization of health care data may well become the norm. I believe that will be well received by healthcare providers who are concerned about the security of health care data held in the cloud by one or a very few entities.

    The end point here, if it gets moving fast enough, is that the lower cost of entry may bring these embattled primary care physicians back into the digital fold, with the sense that they can start over and be proactive participants in the development of an open-source (peer-reviewed) EHR project (like mine) that fits their workflow, rather than being forced to swallow a bitter pill (as MU was handled). We may then start to see improvements in health care costs as more and more primary care doctors begin enjoying medicine again, no longer shackled to poorly adaptive and unsafe technologies.

  12. There is so much hype around the Public API (aka FHIR). No real-world experience. Is it really easier? Does it scale? Sounds like a very chatty interface to me. What liability do I have if I do not pull all the related data? Here we go again. Sounds like we are at the beginning of the Gartner hype cycle all over again.

    I hope that we have learned from C-CDA and Direct that we need real-world trial implementations with a feedback loop before we start yet another national experiment with the Public API.

  13. There is no question that the cost of healthcare in the US is twice what it should be and that the quality is worse than in other industrialized nations.

    I agree that the answer is in the data, and the data from the European countries, and studies done in this country, suggest that the solution is to spend more money at the beginning of life and on preventative healthcare. We already have the data on this issue; we need to implement the solution.

    Of course, I am not opposed to health information technology; it clearly serves a role in my practice and I could not live without it. But I do not think it is the solution to reducing the cost of health care in the US; that solution lies elsewhere. And the data would suggest that the “elsewhere” is to emulate the European model of spending more money on early childhood care.

  14. Hopefully. Patient data is just data and has no particular uses. The same patient data can be used for public health, research, marketing, health care, billing, and policy making. The patient data is non-rivalrous, meaning that with appropriate privacy (read transparency) safeguards, all of the uses of the data can be accommodated without compromise.

    The Public API concept would mean that all of the patient data, regardless of use, would move over the same pipe and with full transparency for the patient it pertains to. In 2014, this is eminently doable.

  15. Hayward, it is widely accepted that the U.S. healthcare system is wasting $1 Trillion / year compared with any other “developed” nation. We, the people, have almost zero transparency of quality or cost and there is no accountability for why. Without data, the redirection of resources toward the youngest and most vulnerable citizens will be based solely on politics and the waste will continue.

    The Public API concept is simple: All data moves over the same pipe and all transactions are digitally visible and accountable to the patient. Once the bits are accessible, what we do with them will evolve with or without regulation.

  16. I have to agree in saying this is overall one of the first positive developments in some time. Could it be that this focus could turn the medical record back to being used for medical purposes (as opposed to billing and/or big data uses)? I am still cautious about being optimistic, however, having seen previous developments that I favored turn into vectors by which the system became less focused on the patients and more on the system itself. I am still skeptical that government regulators can starve out the big money players (health IT now being one of them) they have fed so luxuriously up to now. What is the political driving force behind this change? It’s very hard for this former EHR evangelist to believe that those who have subverted the positive potential of EHRs into the bureaucrat-pleasing systems they now are would have a “Road to Damascus” change and now work for a truly better system. I can hope, but I won’t yet get hopeful.

  17. I guess the medical fax is here to stay…

    As part of Meaningful Use Stage I certification, certified EHRs were required to read CCRs and C-CDAs as well as create either CCRs or C-CDAs.

    As part of Meaningful Use Stage II certification, certified EHRs were required to read and generate C-CDAs and interact with Direct email.

    Now it appears CCRs are gone, while C-CDA and Direct email may be phased out in favor of a new and better standard called FHIR.

    If the Federal government and health information technology “experts” plan on revising the standards for health information exchange every time a new/better standard is created, it will guarantee the fax remains the preferred mode of communications used by physicians.

    As a developer of an electronic medical record program, I can personally attest to the fact that large amounts of money were spent trying to accommodate Federal EHR Certification Standards. Obviously, that money was wasted and innovation was delayed, as there is a limited amount of resources that can be directed toward either “improving” an EHR or revising the EHR to meet Federal standards.

    Although a recent study (Usage and Effect of Health Information Exchange: A Systematic Review. Ann Intern Med. 2014;161(11):803-81) found that health information exchange has an impact on the care delivered in the emergency room, the article makes clear that we have yet to see definitive proof that the technology supporting the universal exchange of health information has a salutary effect on the quality and cost of healthcare commensurate with the expense incurred. Numerous articles have shown that HIT has some effect on the quality of the healthcare system, but I strongly believe there is no hard evidence that the benefits of HIT outweigh its costs. Thus, I must conclude that those HIE dollars would have a much greater return on investment if the money were directed toward early-life preventative healthcare, as suggested by the Robert Wood Johnson Foundation (Time to Act: Investing in the Health of Our Children and Communities, January 2014).

    I understand that the health information technology experts will decry a decrease in HIT spending. However, it needs to be remembered that a large fraction of the HIT community has a vested interest in continued HIT spending, as do the vendors of EHRs and HIEs. While the HIT community has valuable information to contribute to our discussion of HIT issues, their biases must be acknowledged.

    The fact that the health care system is going bankrupt mandates that we use our precious resources in a way that will benefit all of society, and at this time, the best available evidence would indicate that HIT resources should be re-directed toward the youngest and most vulnerable of our citizens.

    Hayward Zwerling, M.D., FACP

    President, ComChart Medical Software
    Practicing endocrinologist, Lowell Diabetes & Endocrine Center

  18. You know, I heard the Deputy National Coordinator of ONC speak recently and his message was not to misinterpret JASON. The purpose is to make recommendations for the future, free of the current constraints of HIT.

    I’m not sure I would take their reports as the direction of ONC.

  19. The standards are already under development in three separate fora, only one of which is health industry-specific. They should take 18 months or so to reach formal release. The healthcare industry can either lead or lag the standards. When the industry wants to drag its heels, five years is too soon and the standards are irrelevant (how long has IHE been trying to standardize interoperability?). When industry has an interest in a standard, product rolls out even before the standard is final.

    Privacy is the wild card. If the health industry continues to hide its data uses behind T/P/O and de-identification, and to avoid transparency through real-time accounting for disclosures, then the API standards will still be insufficient.

  20. Data, like people, wants to be free.
    People, like people, want control of their destiny.

    People and their data cannot be restrained or confined
    to ridiculous and overthought regulation that is blind

    to human nature
    and sensible nomenclature

    let’s get real about what matters
    and extricate ourselves from big money influenced policy tatters

    The time is now to reclaim the future of medicine
    and stop using words that are tantamount to sin

    Provider, EOB, Engagement, Portal, ACO, Fee-for-Value
    these are just a few

    Let’s call things what they are and have a real conversation
    about the existential and material doings of human relations

  21. This sounds like a brilliant development and far more hopeful than I have been. Adrian, thanks for the post. I will have to burrow into the report. The 12-24 month time frame seems very aggressive and optimistic. Tell us your thinking on this.

  22. Mike – Apologies for my oversight.

    We agree they nailed it. Now, the next step seems to be squarely with industry but what will AHRQ and RWJF fund next? A realistic privacy preserving solution to patient ID interop? A fresh look at the role of HIEs and other quasi-public patient databases? Bringing health data brokers out of the shadows? Community-led disease and outcome registries?

    There’s much still to do to get to a Learning Health System.

  23. Adrian, thanks for the careful read and thoughtful post about the most recent JASON report.

    Keep in mind that the release of this report was a co-release among AHRQ, ONC and RWJF. AHRQ was the official sponsor. RWJF helped fund it. Those entities also collaborated on the 2013 JASON report.

    In the prior 2013 report, the agencies and RWJF asked the JASONs for recommendations on developing a health care data infrastructure. For this 2014 report, the agencies and RWJF asked the JASONs to return to that data infrastructure question, but rather than start with a focus on health care, to focus instead on the real goal: health. The sense is that health care, and the data from health care, is important, but there are many other factors driving health and well-being. Plus, there is already an enormous amount of data coming from a range of device sensors. Those data streams outside health care are only going to increase in the future. That means the health care data sets are one kind of data among many, like one of many apps on your smartphone.

    The question for this report, then, was: what happens to the JASONs’ robust health data infrastructure recommendations in that data-rich future? The report is their answer. I personally think they nailed it.

    Thanks again for the thoughtful post.

  24. How about some basics? Like safety, efficacy, usability, accountability, transparency? The learning system should be the EHR itself. Adapting the software running the workflow to enable individualization of care and management of complex illness is mandatory for meaningfully useful EHR systems, none of which happens with currently available systems. All of this is bad for patients and the doctors taking care of them.

  25. It’s all in the JASON report. Even the big “providers” and their big vendors _need_ the information to flow from mobile devices, patient-controlled apps, home monitors and all sorts of _new_ industries these vendors do not control.

    JASON calls this closed loop that includes industries beyond US healthcare (Apple, Google, foreigners, etc…) a Learning Health System. If the big providers want my genome data that’s stored in Google http://www.technologyreview.com/news/532266/google-wants-to-store-your-genome/ they will need to play by Google’s rules. If they want my data from HealthKit http://www.cnn.com/2014/09/25/politics/fbi-apple-google-privacy/index.html they will need to play by Apple’s rules. Apple has already declared their brand to stand for personal control. Google and others will likely choose to do the same in time.

    The providers that don’t get on board will see patient information flow out to those that do. ONC can help settle the transition but economics trumps regulation any day of the week.

    What’s really exciting is the privacy piece. Health services “providers” have avoided strong privacy protections such as consent and accounting for disclosures that are routine in other industries. What other industry has “… information exchanges” that are inaccessible to the person themselves? Will the Argonaut Project http://www.modernhealthcare.com/article/20141204/NEWS/312049998 have strong privacy leadership?

  26. This sentence stands out as the beacon:

    “The future is likely to be founded on web industry standards rather than health vendor standards and be much more respectful of both privacy and cybersecurity as a result.”

    The challenge here is getting past those vendor standards, to which many providers are shackled. How do you see that happening in your 12-24 month timeline? How long will the blood flow before we can start the clock?