Why I Am Still Optimistic About the Future of HIT

[Image: Apple Store, NYC]

Meaningful Use (MU) Stage 2 is making everyone miserable. Patients are decrying lack of access to their records, and providers are upset over late updates and poor system usability. Meanwhile, vendors are dealing with testy clients and the MU certification death march. While this may seem like an odd time to be optimistic about the future of HIT, I am.

The EHR incentive programs have succeeded in driving HIT adoption. In doing so, they have raised expectations of what electronic health record systems should do while bringing to the forefront problems that went largely unnoticed when only early adopters used systems.  We now live in a time when EHR systems are expected to share information, patients expect access to their information, and providers expect that electronic systems, like their smartphones, should make life easier.

Moving from today’s EHR landscape to fully interoperable clinical care systems that intimately support clinical work requires solving hard problems in workflow support, interface design, informatics standards, and clinical software architecture. Innovation is ultimately about solving old problems in new ways, and the issues highlighted by the current level of EHR adoption have primed the pump for real innovation. As the saying goes, “Necessity is the mother of invention,” and in the case of HIT, necessity has a few helpers.

Helper #1 – Technological change

In 2009, when HITECH went into effect, LAN-based client/server was the standard architecture for EHR systems, and that meant using a desktop computer. Accordingly, EHR systems were still being designed with desktop computers and their feature set in mind. The keyboard, mouse, and sometimes voice were the only reliable input choices. Laptops provided some portability, but brought additional issues (e.g., battery life) and did nothing for interface improvements.

In 2009, the iPad was still a year away, and the iPhone was still a novelty. Five years later, we have reliable mobile computers with 64-bit processors that are also able to act as communications devices with video chats and text-messaging.  The library of reliable input methods now includes gestures and more sophisticated voice interaction capability. Together, they open up new ways of interacting with clinical applications that were simply impossible in 2009.

The cloud cannot be overlooked as a major infrastructure advance.  It offers scalability and access to both storage and computing power on-demand, making LAN-based client/server seem… well, primitive. In concert with mobile computers, cloud technology provides the underpinnings for anywhere access, in real time, to even the most sophisticated software.  Consequently, software development and deployment have changed.

Helper #2 – Lower entry barriers

Back in the 1980s, I remember working on a DOS application that required a database. Initially, I tried writing my own disk access routines, which was not fun. Later, I grabbed a copy of the Turbo Pascal Toolbox, which provided the disk-writing routines I wanted. It never worked; I could not get it to compile. Many hours were wasted on something as mundane as saving data to a floppy disk. Back then, relational databases were very expensive, assuming you had a network to run one on (I didn’t). Today, databases of every genre are free: relational, document, object, graph. Operating systems are free (Linux), and there are free frameworks for developing applications in every major language. Professional-quality programming languages and integrated development environments are free. Many cloud providers even offer free development accounts for a year or longer. Finally, collaboration between developers is simple using the cloud, version control, and communication tools.

Delivering a solution to market is easier as well. One can provision a cloud-based web application during a coffee break, no IT department required. The practical effect of these changes? The cost of turning an idea into software has decreased greatly in the last seven years. This means more money allocated to talent and R&D and less to tools and infrastructure. In a 2011 WSJ article, Marc Andreessen, uber-venture capitalist, stated that the cost of running a basic web application in 2000 was about $150,000 per month. By 2011, this cost had shrunk to $1,500. It is even less now. Make it less risky to take chances, and more people will take more chances, which means more entrepreneurs entering healthcare to solve the problems everyone is complaining about.

Helper #3 – Educated consumers

I have seen an interesting and much-welcomed change in my medical colleagues when it comes to buying EHR systems. Fifteen years ago, the main question I was asked concerning EHR systems was which system was the best. My answer was always, “It depends…” No one ever liked that answer. When I explained that implementing an EHR was not just about selecting a system, but about changing everything about the way one practiced, no one liked that either. One clinician who sought my advice, when told that he would have to map the workflows in his practice to make sure the EHR would not disrupt everything, looked at me incredulously and said, “Are you joking?” These days there are enough horror stories and communal misery that it is no longer necessary to convince docs that there is no one-size-fits-all system or that detailed planning is essential—everyone gets it. The upside is that healthcare professionals now have plenty of hands-on experience with EHR systems, and it has made them much better consumers.

Clinicians are demanding readable notes, ease-of-use, productivity enhancement (or at least no loss), and lower ownership costs.   Even better, not only are they demanding better systems, but they are also beginning to discuss software requirements.  Clinicians will not get better systems until they demand them.   Fortunately, these voices are growing louder, and I am very happy to see it.

Helper #4 – The MU certification death march

Renovations are more costly and take more time than building the same thing from scratch. Changing requirements over the course of MU are, once again, proving this to be true. Vendors are being forced to change their products at a breakneck pace (at least in software engineering terms), and the effects are showing up in the form of delayed updates, bugs, and lagging performance.

Vendors have to serve too many masters. They have to keep up with MU certifications and still provide high-quality customer service. In trying to do both, it seems many are doing neither. As a result, they have less time to concentrate on business imperatives such as market share and new product features. Angry clients who are willing to switch products, and vendors that are treading water, make Helper #2 even more of a threat.

New market entrants with products that are stable, mobile-friendly, and available can compete for customer loyalties without the liabilities of vendors already in the market. For one, they can build a system from scratch and not have to deal with backward-compatibility woes. Finally, vendors that have been around for a while are more likely to stick to what is working than try something completely new, which always provides an opening for a new entrant: Google vs. Yahoo, Apple vs. all phone manufacturers, Amazon vs. Borders. You get the idea.

Helper #5 – Clinical informatics research is more varied and plentiful

Even though electronic record systems have been around for years, today there is more research on implementation, interface design, usability, workflow, and security than ever before.  Reports from AHRQ and NIST provide very useful information that is helpful to designers as well as to those implementing systems. Data quality has become a research focus, as has extracting data for clinical research. Today we have more information than ever before on how to design good systems.

Workflow technology, which is important for decision support and usability, has matured significantly over the past 15 years.   Business process management suites and workflow modeling tools are plentiful, and there are good open source tools available.  Companies seeking to enter the clinical care systems market now have access to informatics, human factors, usability and workflow research that has never existed before.  So, contrary to what one might think, companies designing products from scratch have a leg-up. They can readily take advantage of information that is difficult for companies with legacy products to incorporate.

Current EHR systems are from an era that considered electronic access to information as being the key paradigm for clinical care support.  Now, five years into a national experiment that has resulted in substantial adoption of certified EHR systems, we find that patient engagement, team collaboration, information exchange, usability, and workflow capability are just as important as information access in supporting clinical work and quality care.  Times have changed and so must the systems that support clinical care.

New technology, eager entrepreneurs, discerning clinicians, distracted vendors, and plentiful research have set the stage for the debut of the next generation of clinical care systems.   Expect the curtain to go up within the next five years, maybe sooner.


Comments (14 replies)

  1. There are three fundamental aspects of workflow in the digital era: physical tasks, IT (EHR) tasks, and cognitive tasks. Every certified EHR has to have an audit trail to comply with HIPAA: every time ePHI is created, viewed, updated, transmitted, or deleted, the transaction must be captured in the audit trail log (“date-time/who/what/about whom”).

    The ePHI audit log, to me, is a workflow record component. It can’t tell me WHY front desk Susie or Dr. Simmons took so long to get from one transaction element to the next — i.e., physical movements or cognitive efforts — but it can tell me a lot, adroitly analyzed.

    I worked for a number of years as a credit risk and portfolio management analyst in a credit card bank. We had an in-house collections department that took up an entire football-field-sized building, housing about 1,000 call center employees. I had free run of the internal network and data warehouse. One day I just happened upon the call center database and the source code modules (written by an IT employee in FoxPro, which I already knew at an expert level). I could open up the collections call log and watch calls get completed in real time. We were doing maybe a million outbound calls a month (a small Visa/MC bank).

    (My fav in the Comments field was “CH used fowl language,” LOL)

    It was, in essence, an ongoing workflow record of collections activity.

    I pulled these data over into SAS and ground them up. I could track and analyze all activity sorted by any criteria I wished, all the way down to the individual collector level. I could see what you did all day, and what we got (or didn’t) for your trouble.

    I was rather quickly able to show upper management: “Seriously? You dudes are spending $1,000 to collect $50, every day, every hour,” etc. The misalignment was stunning. I started issuing a snarky monthly summary called “The Don Quixote Report” with a monthly “winner.” …Yeah, we called this hapless deadbeat 143 times this month trying to get 15 bucks out of him…

    Well, it didn’t take long to squelch all that. We saved the bank $6 million in Collections Department ops costs that year via call center reforms. Didn’t exactly endear me to the VP of Collections, whose bonus was tied to his budget.

    Gimme a SAS or Stata install and SQL access to the HIT audit logs, and I will tell you some pretty interesting (Wafts-of-Taylorism 2.0) workflow stories.
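The audit-log mining described in this comment can be sketched in a few lines. The schema below (timestamp, user, action) and the sample users are hypothetical; HIPAA mandates that who/what/when/about-whom be captured, but real audit-log layouts vary by vendor.

```python
from collections import defaultdict
from datetime import datetime
from statistics import median

# Hypothetical ePHI audit-log rows: (timestamp, user, action).
# Real audit-log schemas differ by vendor; this is illustrative only.
log = [
    ("2014-03-01 09:00:05", "susie", "view"),
    ("2014-03-01 09:03:45", "susie", "update"),
    ("2014-03-01 09:01:10", "dr_simmons", "view"),
    ("2014-03-01 09:09:10", "dr_simmons", "update"),
    ("2014-03-01 09:12:30", "susie", "view"),
]

def gaps_by_user(rows):
    """Seconds elapsed between consecutive audit events, per user."""
    by_user = defaultdict(list)
    for ts, user, _action in rows:
        by_user[user].append(datetime.strptime(ts, "%Y-%m-%d %H:%M:%S"))
    result = {}
    for user, times in by_user.items():
        times.sort()
        result[user] = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
    return result

gaps = gaps_by_user(log)
for user, secs in sorted(gaps.items()):
    print(user, "median gap (s):", median(secs))
```

As the comment notes, the gaps alone cannot explain *why* a transition took so long, but sorted by user, station, or task type they make workflow bottlenecks visible.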

  2. Many of the billing and payment processes (and perhaps the mindset) hail from the punch card era. Payment for clinical work has been based on codes and the number of words written for lack of better measures. I wonder how much of the dysfunction in the system is due to the limitations of the tools used to manage it.

  3. Ross, thanks for your comment. Actually, I think the days of the monolithic EHR are passing quickly. That type of design was necessary before there was a global network available that allows near instantaneous links between systems. The systems we have today reflect the realities of the era in which they were designed, and for most clinical software this means LAN-based client/server.

    High-speed network access from anywhere using a 64-bit mobile device just became possible last September. Mobile apps are just beginning to reflect this increase in computing power. It wasn’t long ago that video-conferencing required special equipment and going to a dedicated site. Now, it can be done while eating lunch. Software designs that reflect all of the current capabilities of mobile computing do not exist yet, but it’s only a matter of time.

    The value of workflow technology is two-fold. First, all clinical work consists of processes, some of which can be assisted by software better than others. The second, and possibly most important aspect of workflow technology, is that it forces one to pay attention to the details of how people do their jobs.

    Clinical work involves task sequences, information use, and resource interactions. Formal work on workflow patterns has been around in the computer science world since at least 2000 (http://ehrscience.com/2013/08/19/workflow-patterns-part-i-a-pattern-based-view-of-clinical-workflows/). Yet, workflow patterns are nearly unheard of in health care. Without formal models of clinical work, how can systems be designed to support what clinical professionals do?
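Two of the basic workflow patterns from that literature (Sequence, and Parallel Split with Synchronization) can be illustrated with plain functions. The clinic task names here are hypothetical, not drawn from any real system.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical clinic tasks; each appends a marker to the patient record id.
def check_in(patient):    return patient + ":checked_in"
def take_vitals(patient): return patient + ":vitals"
def order_labs(patient):  return patient + ":labs"

def sequence(patient):
    """Workflow pattern 1 (Sequence): tasks run strictly one after another."""
    return take_vitals(check_in(patient))

def parallel_split_and_sync(patient):
    """Workflow patterns 2 and 3 (Parallel Split / Synchronization):
    independent tasks run concurrently, then join before work continues."""
    with ThreadPoolExecutor() as pool:
        vitals = pool.submit(take_vitals, patient)
        labs = pool.submit(order_labs, patient)
        return (vitals.result(), labs.result())  # synchronization point
```

Even a sketch this small forces the questions that matter for clinical software: which tasks truly depend on one another, and which can proceed in parallel.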

  4. There is so much that has improved, and much about which we have good reason to be optimistic. This well-written piece is a good summary.
    I differ about two items–neither has to do with the quality of the HIT software.
    1. While it’s true that a new vendor could write an EHR from scratch without the old baggage, I doubt anyone could really attempt this because of the hundreds of thousands of pieces required and the complexity of integration.
    2. I think you overestimate the ability of workflow software to have a serious impact. So much of workflow is determined by power, finance, etc., that even the most helpful software would run into a brick wall.

    Again, much to be thankful for….and we all hope for improvements.

  5. Physicians really don’t hate their EHRs, they hate what they have to do with them: MU, ICD-9/10/11, CPT, P4P, CYA med/mal. As long as EHRs are designed to work in a dysfunctional system, they’re going to be hated.

  6. Thanks, Dr. Carter. At AHRQ we are passionate about building the evidence on how health IT can improve health care quality, and we look forward to providing more research findings on usability and usefulness in the months and years ahead. We’re grateful for our colleagues in the research community and the work they do to support this goal. Here’s to the future-


  7. Bill, thanks for your comment. I, of course, agree that the availability of skilled professionals with clinical informatics training will help in ushering in the next generation of clinical care systems. The changes taking place in clinical informatics are wonderful to see. One area of clinical informatics training/education that I think needs both greater emphasis and better tools is workflow analysis/modeling. Creating better clinical systems requires process awareness.

    Thanks for asking about the book. As with any book, the publisher makes the ultimate decision regarding when/if a new edition will be released.

  8. Thanks Jerome, this is a great piece, and I share your optimism. I am not uncritical of the things we have done wrong in HIT, but we need to understand and learn from what has been done wrong, and continually strive to improve it.

    I would, however, change one of your helpers, namely #5. I would instead say, Clinical informatics is developing as a discipline and will lead to better HIT. The research you mention is part of it, but I would also add the people who are emerging with the knowledge and skills to apply informatics in clinical settings. There is of course the new physician subspecialty of clinical informatics that will provide a cadre of leaders with the appropriate expertise. But clinical informatics is not limited to physicians, and those with other backgrounds, both clinical and non-clinical, will also lead the way.

    By the way, do you ever plan to publish another edition of your book on EHRs? I would love to see a new edition reflecting the current world of EHRs.

  9. We got here because of two assumptions that seemed quite reasonable at the time. The first was that clinical care would be improved by better access to patient information. The second assumption was that better access to information would automatically result in increased clinical care productivity. The first assumption was correct, the second one, not so much.

    Better access is useful, which is why most EHR users would not want to go back to paper. However, what we are discovering, courtesy of the EHR adoption experiment, is that clinical work is about more than access to information. Information repositories are helpful, but support for clinical work requires software designs that address the routine things that clinicians do 100 times each day. This type of information was hard to come by before. However, thanks to the growth in informatics research that looks at workflow/usability issues, and to clinicians becoming more vocal about their likes and dislikes, design guidance is increasingly available.

    As for the pessimistic list, Googling MU stage 2 should do the trick…

  10. Indeed. There are clearly many, many reasons to be optimistic about the future of HIT – all the more reason for us to ask:

    How did we get here?

    I’d like to see a companion “Reasons I’m pessimistic about the future of HIT” list. I’m afraid it would be a long one.