HIMSS Unplugged


HIMSS has opened and closed in Florida and I’m in Boston with snow up to my rectus abdominis. After several years of watching keynote pageants and scarfing up the amenities at HIMSS conferences, I decided to stay home this year.

Writing articles from earlier conferences certainly called on all my energy and talents. In 2010 I called for more open source and standards in the health care field. In 2012 I decried short-term thinking and lack of interest in real health transformation. In 2013 I highlighted how far providers and vendors were from effective patient engagement.

In general, I’ve found that my attendance at HIMSS leads to moaning and carping about the state of health IT. So this year I figured I could sit in my office while moaning and carping about the state of health IT.

In particular, my theme this year is how health IT is outrunning the institutions that need it, and what will happen to those left behind.

The scissors crisis: rising IT expenditures and declining revenues

Although the trade and mainstream press discuss various funding challenges faced by hospitals and other health providers, I haven’t seen anyone put it all together and lay out the dismal prospects these institutions have for fiscal health. Essentially, everything they need to do in information technology will require a lot more money, and all the income trends are declining.

Certainly the long-term payoff for the investment in information technology could be cost reductions–but only after many years, and only if it’s done right. And certainly, some institutions are flush with cash and are even buying up others. What we’re seeing in health care is a microcosm of the income gap seen throughout the world. To cite Billie Holiday: them that’s got shall get; them that’s not shall lose.

Here are the trends in IT:

  • Meaningful Use requires the purchase of electronic health records, which run into the hundreds of thousands of dollars just for licensing fees. Training, maintenance, storage, security, and other costs add even more. The incentive payments from the federal government come nowhere near covering the costs. EHR providers who offer their record systems on the Web (Software as a Service) tend to be cheaper than the older wave of EHRs. Open source solutions also cost much less than proprietary ones, but have made little headway in the US.
  • Hot on the heels of Meaningful Use is ICD-10 compliance, a major upgrade to the diagnostic codes assigned to patient conditions. Training costs (and the inevitable loss of productivity caused by transitions) could be staggering. Some 80% of providers may miss the government’s deadline. The American Medical Association, citing estimated costs for a small practice of $56,639 to $226,105 (p. 2), recently urged the government to back off on requiring ICD-10. Their point of view seems to be that ICD-10 might have benefits, but far fewer than other things the providers need money for. Having already put off its deadline once, the Department of Health and Human Services refuses to bend further.
  • All providers need sophisticated analytics. This is because both the Centers for Medicare &amp; Medicaid Services and private insurers are moving to bundled payments and “pay for value.” In various configurations, these schemes pay doctors what the payer estimates treatment should cost, not the cost of the actual procedures the doctors performed. Smart institutions that provide well-integrated care and track their patients to ensure adherence to treatment plans can suppress their costs and actually earn more money. But if a provider carries on the way most do–losing track of patients who fail to come for appointments, performing unnecessary tests because they can’t get the results of earlier tests, etc.–its bottom line will suffer. The change is certainly good for health care consumers, but smart treatment also costs money: to collect accurate data, to share it among providers, and to carry out the data crunching that turns up risks among patient populations.
  • Patients and staff are coming to expect other amenities, such as web portals, that require investment in both technology and workflow changes. All these IT factors are compounded by other rising costs in the health care industry. Even the most powerful institutions are having trouble passing those on to payers and consumers, as they have done year after year. Aside from the “pay for value” programs, Medicare costs are being kept down for budgetary reasons, but the government’s intent is good: they expect providers to learn how to be more efficient.
  • Finally, more people are going on Medicaid, especially in states that took up the federal government’s offer to expand Medicaid under the provisions of the Affordable Care Act. Medicaid pays less than most private insurers, and low-income households tend to have more difficult medical conditions to treat. This could be the last straw for many providers.

    So add up the two sides of the balance sheet: precious little black and more red than even a surgeon wants to see. The only way out is to get a lot of data and bang on it until it coughs up the information that is key to reducing costs. But who will provide such statistical wizardry?

    Knowledge is not only power, but wealth

    HIMSS teems each year with firms offering to guide hospitals along the path of wisdom. They can tell you which patients to spend more time with because they’re at risk of being hospitalized (patient stratification). They can optimize your use of rooms and other resources. They can tell you where to invest your limited IT money and get the most savings back. The problem is that doing analytics costs a lot of money, given all the steps involved:

    1. Someone must enter the data that feeds the analysis. Natural language processing systems can look for important indications in the free text entered by clinicians, but it’s very hard to track everything that way. Structured data takes a lot more time to enter, and staff must be trained to enter it in consistent terms and formats.
    2. Analysts must detect the relationships in the data, a task that can trip them up in health care because it presents so many confounding factors. For example, an analyst might compare two hospitals without statistically adjusting for their relative mix of procedures, the types of patients that frequent each hospital, etc. Just graphing data without looking for such deeper patterns can be very risky. Organizations adopting predictive analytics must do additional work to generate the predictive models.
    3. Analysts must then create charts or reports showing important relationships. Many people have the training to do this, but not necessarily well. “It’s all too easy to generate graphs that look insightful but that do not communicate statistically sound insights,” says Arijit Sengupta, CEO of BeyondCore.
    4. Business users must take time to review results. This calls for training in how to review data. Often the results presented aren’t exactly the data needed by the business user, who must now provide feedback to the analyst to help her redo the analysis.
    5. The hospital or clinic has to act on results–calling patients, scheduling visits and tests, etc.
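To make the stratification work in step 2 concrete, here is a toy sketch of patient stratification: scoring structured patient records with a rule-based model so outreach can focus on the highest-risk patients. All field names, rules, and thresholds are invented for illustration; real models are learned from data and validated clinically.

```python
# Toy sketch of patient stratification. The scoring rules and thresholds
# below are invented for illustration, not clinical guidance.

def risk_score(patient):
    """Assign a crude readmission-risk score from structured fields."""
    score = 0
    if patient["age"] >= 65:
        score += 2
    if patient["prior_admissions"] >= 2:
        score += 3
    if "hypertension" in patient["conditions"]:
        score += 1
    if patient["missed_appointments"] > 0:
        score += 2
    return score

def stratify(patients, threshold=4):
    """Return IDs of patients at or above the threshold, highest risk first."""
    scored = sorted(((risk_score(p), p["id"]) for p in patients), reverse=True)
    return [pid for score, pid in scored if score >= threshold]

patients = [
    {"id": "A", "age": 72, "prior_admissions": 3,
     "conditions": ["hypertension"], "missed_appointments": 1},
    {"id": "B", "age": 40, "prior_admissions": 0,
     "conditions": [], "missed_appointments": 0},
    {"id": "C", "age": 67, "prior_admissions": 1,
     "conditions": ["asthma"], "missed_appointments": 2},
]
print(stratify(patients))  # ['A', 'C']
```

Even this toy version shows why step 1 matters: the scoring depends entirely on structured fields being entered consistently.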

    Steps 2 and 3 contain the actual analytics, but the other three steps are probably even more of a burden on the health care institution.

    Not that the challenges of steps 2 and 3 are minor. Companies across all industries compete intensely for analysts. A realization is growing among businesses of all types–manufacturing, finance, retail–that they are really software businesses. Big data is the watchword of the era, and the dearth of statistical expertise among leading firms in every sector leads to insane demand for those skills. Sengupta cites an estimate from the McKinsey Global Institute that the US needs 190,000 more analysts. These experts need years of education and command six-figure salaries.

    Ultimately, according to Sengupta, the total cost of ownership includes the cost of software and the cost of hiring the analysts or consultants–but also major hidden costs such as the training of business users, the time these business users need to take away from pressing day-to-day operations to review data analyses, and the wrong directions hospitals can take if users miss key insights or act on apparent patterns that were not statistically sound. The McKinsey report echoes Sengupta’s point, projecting a shortage of 1.5 million data-savvy managers in the US.

    BeyondCore says they have cut costs for all these steps by automating the process of detecting and presenting the key insights to be found in the data, as described in an O’Reilly article. Instead of taking up analysts’ time coming up with different hypotheses or questions and testing them against the data to see what’s relevant, BeyondCore can automatically generate and test millions of such questions, finding results both faster and more accurately. At the 2013 Strata Rx conference, McKinsey reported that, by working with BeyondCore, they replaced five months of manual analysis with a couple of hours of automated analysis that they were able to initiate with a single click.

    Dan Riskin of Health Fidelity, calling in from the HIMSS conference, explained to me some of the reasons analytics are so costly and difficult for hospitals. Riskin is a practicing surgeon, a clinical informaticist, a consulting faculty member at Stanford, and an entrepreneur who has created several health-related companies. The issues he discussed break down into two categories: the context and the motivations.

    The context involves the electronic record systems in clinical settings. Most were developed a decade or more ago, sometimes over two decades ago. They require a lot of customization, as well as local installation and maintenance instead of running in the cloud. Hospitals become accustomed to this way of deploying software and try to fit analytics in around the existing systems. Thus, the analytical tools also require customization and local installation, a fearsome burden for IT staff who are already tasked with keeping wireless networks up, monitoring security, and performing other critical functions for patient care and safety.

    Still, there is hope here because analytics are more modern than most EHR systems. Many firms, including Health Fidelity, accept data from records and do the processing in the cloud. Small hospitals, and those lacking funding, may be able to take advantage of such services.

    The motivations for using analytics do not augur well for making an effective contribution to improving patient care. Currently, hospitals have their eye on Meaningful Use incentive money. This requires them to report things such as how many patients lowered their blood pressure below certain levels. For most patients this is a worthy goal, but for former stroke victims it actually increases their risk of stroke, yet the Meaningful Use program makes no distinction between patient types. Thus, incentive payments are a useful starting point, but not comprehensive or flexible enough.
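The problem with one-size-fits-all measures can be expressed in a few lines: a measure that conditions on patient history behaves very differently from one that does not. The field names and the blood-pressure threshold below are invented for illustration, and the clinical rule is drastically simplified.

```python
# Toy sketch: a naive quality measure vs. one that accounts for patient type.
# Field names and the 130 mmHg threshold are illustrative, not clinical guidance.

def naive_bp_measure(patients, systolic_target=130):
    """Count every patient below the target, regardless of history."""
    return sum(1 for p in patients if p["systolic"] < systolic_target)

def context_aware_bp_measure(patients, systolic_target=130):
    """Exclude patients for whom the target may be inappropriate."""
    eligible = [p for p in patients if "prior_stroke" not in p["history"]]
    return sum(1 for p in eligible if p["systolic"] < systolic_target)

patients = [
    {"systolic": 120, "history": []},
    {"systolic": 125, "history": ["prior_stroke"]},
    {"systolic": 145, "history": []},
]
print(naive_bp_measure(patients), context_aware_bp_measure(patients))  # 2 1
```

The two functions report different numbers for the same population, which is exactly the flexibility the current incentive program lacks.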

    According to Anand Shroff, chief technology and product officer of Health Fidelity, recent Medicare initiatives such as Accountable Care Organizations (ACO), shared savings programs (MSSP), bundled payment programs, and continued advances in existing programs such as Medicare Advantage could force the health care field to improve care. Several observers have estimated that 70% to 80% of patient information is preserved in unstructured data, such as physician notes. No electronic system can currently capture, in discrete fashion, all the factors that go into evaluating patients’ risk and deciding how to adjust payments for that risk.

    For instance, a patient may smoke, suffer from asthma, and have high blood pressure–all these go into determining how to treat her and how much Medicare should pay to do so. Shroff also endorsed the move to ICD-10, because its more specific diagnoses are critical to high-quality data as a step on the way to more granular terminologies such as SNOMED.

    Health Fidelity’s web-based platform, called REVEAL, extracts meaning from the unstructured data through natural language processing–which Riskin said took 20 years to develop and is a leader in the field–and from an inference engine that finds relationships to highlight risk and quality. The system augments clinical research with models developed in-house, learning from the data, to develop risk assessments.

    Riskin expects that in 2015 or 2016 analytics will move to a higher stage to address patient stratification and risk management. In fact, some hospitals are already doing this to avoid the recent penalties levied by Medicare for patients readmitted within 30 days of a discharge. There is much more they can do here, such as tracking patients with hypertension who are suffering from stress or not taking their medication. As payers increasingly require providers to share risk, analytics to measure that risk will take off. Analytics are of no value in themselves, of course: they must be integrated into workflows, for instance by calling patients who have missed visits.

    Several years off, according to Riskin, analytics will rise to yet greater heights, and will look “deep and long” at patients to figure out what improves quality of care. This requires sophistication and patience–it may take years of data, for instance, to determine the course of hypertension for one patient, including what worked and what did not–but that’s what will really lower costs and make analytics pay for themselves.

    Shroff compared the current state of health care to the adoption of personal computers and ERP systems in other industries during the 1980s and 1990s. Economists noted that the expected productivity gains from computerization failed to show up for some 20 years, calling this the “productivity paradox.” With the rush to electronic health records and information exchange, the health care field has finally begun its own productivity march.

    Currently, IT is focused on checking boxes: making sure providers meet Meaningful Use guidelines and start using ICD-10. According to Shroff, innovative uses for health IT are on hold. It will be at least two more years before health providers can start to implement the changes that make effective use of the data produced.

    In the meantime, Shroff validated one of the theses of this article: that hospitals will survive through mergers and acquisitions. He expects that the US will divide into 200 or 300 regions, each dominated by one health system that buys up or partners closely with the rest. This will basically carry out the Accountable Care Organization model.

    The big regulatory change Riskin is hoping for is a requirement for EHRs to exchange full clinical data. Currently, interoperability is required only for patient summaries, which are sufficient for billing, but lack the depth of information required to tell the deeper story about the future course of the condition, effective therapy, and risk.

    I asked Riskin whether he thought data from many institutions must be combined to produce useful analyses. He did not feel the answer is clear yet, but pointed out that each institution is already sitting on huge amounts of valuable data and not making good analytic use of it. Furthermore, health care providers are merging, so data is scaling up.

    The bigger your patient population, the more subtle are the trends you can find. Public health, at the upper end of the size range, offers great benefits because agencies can collect statistics on an entire population. But the kinds of data they gather are a lot slimmer than what hospitals know.

    Therefore, one national solution to the scissors crisis would be to take patient data (anonymized as well as we know how to do) from each health provider and create national databases. Analysts could then line up for access (signing terms of service that prevent them from re-identifying patients and making malicious use of the data). A market for innovative data uses could then emerge. The costs could be subsidized by the government to make them accessible to all providers, even small rural ones.
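As a toy illustration of what “anonymized as well as we know how” might begin with, the sketch below drops direct identifiers and keys each record by a salted hash. This is pseudonymization at best, not true de-identification; the field names and salt are invented, and real de-identification (for instance under the HIPAA Safe Harbor rules) involves far more than this.

```python
import hashlib

# Toy sketch of preparing records for a shared research database:
# drop direct identifiers and key each record by a salted hash.
# This is pseudonymization, not full de-identification; field names
# and the salt are invented for illustration.

SALT = b"replace-with-a-secret-random-salt"
DIRECT_IDENTIFIERS = {"name", "ssn", "address", "phone"}

def pseudonymize(record):
    """Strip direct identifiers, add a stable opaque patient token."""
    token = hashlib.sha256(SALT + record["ssn"].encode()).hexdigest()[:16]
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    cleaned["patient_token"] = token
    return cleaned

record = {"name": "Jane Doe", "ssn": "000-00-0000", "address": "1 Main St",
          "phone": "555-0100", "diagnosis": "hypertension", "age": 54}
print(pseudonymize(record))
```

Even with identifiers stripped, rare combinations of clinical facts can re-identify patients, which is why the terms of service mentioned above would still be essential.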

    But in an industry where patients report they can’t even get their records transferred electronically from one hospital to another in the same system, how soon can we expect that to happen?

    Small providers without many resources will therefore cope with the squeeze another way: mergers and acquisitions. The more successful institutions will buy up all the others; we’re seeing that already.

    Case study of a small hospital: Elyria Medical Center

    It’s refreshing to talk to someone applying Meaningful Use and quality improvements in an everyday hospital, as a break from listening to industry leaders from well-endowed institutions. I had a chance to interview Charlotte Wray, CIO of a 387-bed hospital called University Hospitals Elyria Medical Center (formerly EMH Healthcare) in Elyria, Ohio, a community of about 54,000 people. The story of this hospital shows both that small community providers can achieve major advances, and that they face daunting challenges along the way.

    But even EMH Healthcare reached the limit of what it could achieve as an independent. So on January 1 of this year, they completed an integration with University Hospitals, a network of 14 hospitals and 26 outpatient health centers in 16 Northeast Ohio counties. Wray explains that this will give them the support and resources to continue their quality improvement efforts and to focus on high quality and great value.

    Before the merger, EMH Healthcare managed to:

    • Install Soarian Clinicals as their electronic record system, and take advantage of the control it offers over workflow and clinical decision making. For instance, Wray explained, the Soarian workflow engine can provide decision support to clinicians and decrease the variability in the flu and pneumonia vaccination administration process. Two thousand more patients are now getting those vaccines, and the hospital misses fewer than 2% of those who should receive them. Soarian monitors 86 decision points that patients pass through from entry to discharge, each decision point a source of potential variability.
    • Integrate the clinical EHR with the rest of the hospital functions (billing, payroll, pharmacy, and so on). The hospital relied heavily on HIT consulting firm Stoltenberg Consulting for strategizing and project management.
    • Integrate Allscripts Enterprise, used by their ambulatory clinicians, with their patient portal and their Health Information Exchange, CliniSync.
    • Attest to Meaningful Use Stage 1 for the past three years, and work toward Stage 2 for the third quarter of this year.
    • Reach HIMSS Stage 6 status. (HIMSS defines eight stages of EHR use, and Stage 6 is the second highest.)
    • Plan for ICD-10 compliance on schedule.

    These achievements required careful apportioning of resources and the involvement of clinical staff as internal consultants. Nurses were educated to understand the system and train others in its use. Every change was scoped out financially and presented to management as fulfillment of a business need.

    Asked what challenges hospitals like hers face, Wray pointed to several: it is becoming more difficult to deliver cost-effective care; hospitals have access to a lot of useful data that must be analyzed to better manage their patient populations; and patient engagement becomes crucial because, for chronic conditions, it is the patient who makes the main decisions affecting their health.

    Can all American health institutions pull off what has been accomplished by Elyria, together with University Hospitals? The federal government expects it of them. In fact, anyone breathing should expect it of them. It is our lives and our health that depend on it. But the current course of the industry, and of access to necessary technology, does not engender confidence.

    Turning over a new electronic health record

    Even as HITECH and Meaningful Use regulations push for the adoption of EHRs, the health care field knows their weaknesses all too well. There is a sense that EHRs need to shake off the 1980s-era software models that make them hard to install and use, difficult to integrate with the web, and resistant to data sharing.

    One of the companies taking a fresh approach is Viztek, which received a good deal of buzz just for being an anomaly: a new company trying to enter a space where a shake-out is imminent, and coming from an unusual direction as a manufacturer of radiology systems. But two other aspects of Viztek’s new system interested me more: their adoption of modern software technology and their focus on simple interfaces.

    The technology side is illustrated by their basing the EHR on the relatively new JavaScript back-end technology, Node.js, their use of a PostgreSQL database, and their provision of the EHR through Software as a Service. They also integrate with existing services, such as accepting input from voice recognition on clinicians’ mobile phones, instead of trying to bring all technology under their own umbrella.

    The interface innovations are hard to pinpoint, but I can just say that VP Steve Deaton is acutely aware of the tension between the complexity of doctor/patient interactions and the low computer sophistication of many clinicians. Viztek is trying, rather like Apple in computer systems or Mercedes Benz in cars, to provide complex functionality in extremely simple packages. I’m looking forward to seeing whether companies such as Viztek can pull the EHR out of the mire.

    The high cost of talking to each other

    Some of us remember the days when long-distance calls were expensive. You called your brother across the country once every few months when something important happened such as a graduation or a death in the family. Nowadays we Skype with people on other continents and don’t even think about where we’re phoning within the US. (In fact, you can’t tell from people’s cell phone numbers where they’re currently located.) But for clinical settings, sending data around is like making phone calls in the 1960s.

    Hospitals can spend millions of dollars trying to get these systems to work together. I had a chance to talk to a company offering a more affordable solution: Applied PilotFish Healthcare Integration, a wholly owned subsidiary of PilotFish Technology. They have taken on the infinitely multifaceted task of translating health data between systems.

    Initial installation for PilotFish costs about $100,000 for the standard end-user, with fees in the tens of thousands for ongoing licensing and use of its administrative console. Additionally, PilotFish offers much lower prices to solution providers that bundle software with the solution. Hospitals that have gone through mergers, use different patient records in different departments, or even want to integrate their financial records with their clinical records and their pharmacy records can use PilotFish’s service.

    Businesses are often advised to outsource tasks that are not part of their core functions. Believe me, whatever business I happened to be in, I would jump to outsource the translation of data from one obscure, legacy proprietary format to another–and nowhere as much as in health care, a home for extremely complex data formats that have evolved in awkward directions over decades (just try, for instance, to get your head around the HL7 Clinical Document Architecture, which forms the basis for standard data exchange between patient records).

    Data can exist in formats ranging from highly structured (but proprietary) electronic record systems to web-based APIs, EDI formats, and even flat files. PilotFish has brought many of these formats into its service and provides them through a simple menu interface. PilotFish Healthcare CEO Neil Schappert said a health care provider needs only about 20 minutes to configure an interface that provides real-time connectivity and data transformation between disparate systems.
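To give a flavor of the translation work involved, here is a minimal sketch that parses one pipe-delimited HL7 v2 patient-identification (PID) segment into a dictionary. The sample segment is fabricated; a real integration engine such as PilotFish’s must handle escape sequences, field repetitions, dozens of message types, and many non-HL7 formats besides.

```python
# Minimal sketch of parsing an HL7 v2 PID segment. The sample data is
# fabricated; real engines handle far more cases than this hints at.

SAMPLE_PID = "PID|1||12345^^^GENHOSP^MR||Doe^Jane||19701225|F"

def parse_pid(segment):
    """Extract a few common fields from a pipe-delimited PID segment."""
    fields = segment.split("|")
    if fields[0] != "PID":
        raise ValueError("not a PID segment")
    family, given = fields[5].split("^")[:2]   # PID-5: patient name
    return {
        "patient_id": fields[3].split("^")[0], # PID-3: identifier list
        "name": f"{given} {family}",
        "birth_date": fields[7],               # PID-7: date of birth
        "sex": fields[8],                      # PID-8: administrative sex
    }

print(parse_pid(SAMPLE_PID))
```

Multiply this by hundreds of segment types, vendor quirks, and local customizations, and the value of outsourcing the job becomes clear.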

    An example of a PilotFish implementation is the automated medication dispensing carts provided by Emerson Electric Corporation’s InterMetro subsidiary, which must be integrated with billing, electronic health records, and pharmacy systems within the hospital.

    As long as the health care field depends on proprietary solutions from many different vendors who lack the incentive to interoperate in a standard manner, integration problems will remain a headache. Without integration, of course, analytics are not much use. Both have come to play crucial roles in staying in business and providing good health care.

    Who was talking about these problems at HIMSS? Let me know…

    Andy Oram is an editor at O’Reilly Media, a highly respected book publisher and technology information provider. His work for O’Reilly includes the influential 2001 title Peer-to-Peer, the 2005 ground-breaking book Running Linux, and the 2007 best-seller Beautiful Code.

    5 replies

    1. I appreciate your insights here. I’m amazed by what population health management and risk stratification could mean for US health care. I’d love to read a prospectus on what health care could look like in 10 years if we can successfully implement this sort of technology and approach.

    2. About the prediction that consolidation will result in regions that realize the ACO model vision: that contrasts with the Centers of Excellence and medical tourism approach that many employers favor, which seems to be happening with transplantation and cancer treatment. How about seniors who snowbird? It seems to me that tethering health info to a single system won’t work no matter how much easier that makes it for large healthcare systems.

    3. Anticipating the angst of October 1st, we’ve developed and recently released an app that may help alleviate the anxiety related to ICD-10.

      You can check it out at http://icd10doc.com

      Your feedback would be appreciated

    4. “But for clinical settings, sending data around is like making phone calls in the 1960s.”

      You nailed it right there. I’ve been kinder in my commentary by saying that healthcare IT is partying like it’s 1985, but I believe your observation is more correct.

      Plus, with all of the exhaust fumes still emitting from HIMSS, I have yet to see a real breakdown of when the magical day will arrive when EHR technology delivers direct value to *patients* – so far, no soap there. We still have to either hand-carry or bird-dog records delivery across care delivery systems.

      Patients are at Meaningful Use Stage -50 …

    5. “Who was talking about these problems at HIMSS?”

      Me. Before, during, and after. Google REC Blog.

      Great article. I’ll be citing it. You’re all over it.