Todd Park Was Right…Now What?

In March of 2005, I staffed an interview between Todd Park and Steve Lohr of The New York Times in the cafeteria of the old New York offices of the “Grey Lady.” At the time, Park was heading a very small web-based start-up company that was trying to convince medical groups – and on that day, a leading national technology business reporter – that web-based “cloud” technologies would become mainstream in the healthcare IT industry and were the only logical means to get the hundreds of thousands of independent U.S. doctors and their small offices to go digital.

At the time, Lohr, one of the foremost technology reporters in the country covering IT giants like Microsoft, IBM and Intel, had just started covering Health IT upon the appointment of Dr. David Brailer as the nation’s first National Coordinator for Health Information Technology (or, as many called him back then, the “Health Information Czar”). In fact, Lohr had just gotten back from attending the annual HIMSS Conference in Dallas, where he met with CEOs of “legacy” healthcare IT behemoths like IDX (now GE), Siemens, Cerner, Allscripts, McKesson and Epic.

In his first article addressing Health IT adoption in the U.S., Lohr touched on what he felt was the core challenge to achieving widespread EHR adoption: getting small medical practices to adopt and actually use these systems – something that had eluded the industry and those legacy IT vendors for many years. On the topic of getting small practices to adopt EHRs and the potential harm to the industry and the Bush Administration’s efforts if they didn’t, Dr. Brailer told Lohr, “The elephant in the living room in what we’re trying to do is the small physician practices. That’s the hardest problem, and it will bring this effort to its knees if we fail.”

Last week President Obama appointed Todd Park as the new Assistant to the President and U.S. Chief Technology Officer, with responsibility for ensuring the adoption of innovative technologies to support the Administration’s priorities, including affordable health care. This got me thinking.

Since taking office, President Obama has made some strong moves to champion the adoption of EHRs through the passage of the HITECH Act. This act, combined with the earlier relaxation of the Stark anti-kickback laws, has enabled a spike in EHR adoption as medical groups work to qualify for Meaningful Use dollars. But it has also had some unintended consequences that Mr. Park may now find himself in a unique position to rectify if he stays true to his support of cloud computing.

Over the last two years, the number of independent medical practices, often small ones, that have become hospital-owned has more than doubled. This is a direct result of doctors selling out to their local hospital to use its legacy EHR (the kind made by the companies Lohr met with at HIMSS in 2005) in an effort to become meaningful users of a clinical system and collect incentive dollars. But rather than turning to more modern, web-based or cloud-computing EHRs, these doctors are now on the very systems that have been in place for decades – systems the market never adopted on the basis of actual value.

Some pundits predict, and fear, that this shift could lead to greater hospital vertical integration and subsequent pricing pressure on local insurers. Not such a bad thing, I guess… unless these newly owned doctors are sending more and more patients to the hospital for procedures at higher rates. So in gaining higher adoption of EHRs, does the end justify the means if the result increases healthcare costs and makes affordable care in the U.S. harder to achieve?

So how do you get doctors and practices of all shapes and sizes to adopt EHRs without driving up costs and driving down affordability? Well, every forward-thinking person working in the health IT industry whom I have spoken to knows that cloud-based technology is the logical way to let independent doctors go digital, get connected, and collect their government incentives while staying independent.

The problem remains the same as it was in 2005 during that interview at The New York Times. The dominant health IT vendors have never truly been pressured to move their models and technology platforms from client-server to a true cloud-based platform (a third-party-hosted version of Cerner or Allscripts doesn’t count). Todd Park knows this, and he certainly didn’t use legacy technology to launch the much-publicized, forward-looking initiatives – HealthCare.gov and the Health Indicators Warehouse – that he drove as CTO of the Department of Health and Human Services.

With the proposed Meaningful Use Stage 2 standards released just a few weeks ago, it’s not too late to achieve what two federal Administrations have pursued while still spurring the kind of innovation and market disruption on which Todd Park has built his stellar career.

Web-based technology, if widely adopted, will have a transformative impact on healthcare. Todd Park knew that seven years ago. Now that he’s our national CTO, I hope he will stay true to his vision and work to modernize healthcare through the power of the cloud.

John Hallock is Vice President of Corporate Communications at CareCloud. He joined the company in February of 2012 and is responsible for overseeing CareCloud’s communication and public relations programs with media and industry analysts while helping to promote the company’s growing product line and drive overall brand awareness. Prior to joining CareCloud, Hallock held the position of Director of Global Corporate Communications at athenahealth.


  1. Jeff, I don’t believe there is a dichotomy here at all. I just think that the method of delivery is largely irrelevant. Software need not be browser-based in order to be Internet-based.
    There are several browser-based EMR systems that are meeting all regulatory criteria and doing well in this regulation-driven market. By the same token, there are non-browser-based products doing at least as well, and many doing much better, in both market share and customer satisfaction. Cost of ownership over multiple years is not lower for the browser-based crowd.
    I agree that upgrades are easier when done centrally, but very few providers can afford the luxury of not upgrading their software, since the Meaningful Use program requires that you do so on a regular basis. Either way, I am not certain that the vendor should necessarily function as an enforcer.
    As for the technology per se, I would suggest that when you lift the covers, some of the supposedly cutting-edge “cloud” solutions have a core of what could be considered antique programming – and so do the non-browser guys – and everybody is busy migrating to newer languages.
    The main point I’m trying to make is that there is no substantive value derived from a “cloud” label.

  2. I think you’re presenting a false dichotomy. The choice isn’t big, feature-rich, and installed on-premise vs. minimalistic, can’t-do-what-you-need cloud. Unfortunately, I can’t find great numbers for the “legacy” systems like Epic, but looking at the “cloud” company this article says John used to work for (http://www.athenahealth.com/meaningful-use.php), there appears to be at least one vendor that meets the government requirements and is not an installed, on-premise “legacy” piece of software.

    As for pivoting, my provider uses Epic, and I generally watch them as they use it. The technology looks like it’s straight out of 1998, and when they lock the machine, the login prompt says 2004. Now I’m not saying that Epic can’t write a good piece of software, but if their providers aren’t going to upgrade constantly to the latest and greatest versions, then their install base is always behind the times, using tools that don’t meet new requirements. The on-premise solution is great for a sales demo, but what you see in your sales demo is what you are still using years later, as the newer tools pass you by. It’s a lot easier to read a new requirement, build a solution for it, and push that solution out to all providers when you are a hosted solution than when you have one of the “legacy” installs.

  3. I would agree with your notes, except that the vendors who seem to be pivoting better and faster are those usually labeled “legacy” in minimalistic cloud circles. Epic releases new versions on a regular basis, and today’s Epic bears little resemblance to what it was 10 years ago. The equally “legacy” eCW product is the de facto state of the art, including mobile health, patient engagement, analytics and interoperability. If you’re looking for pivoting ability, my impression is that medium-to-large privately held companies, exclusively dedicated to EHRs, are where the action is.
    Building an EMR from scratch is not easy, and as ePocrates found out, customers are not looking for bare-bones stuff, largely because the government closed the market to simplistic documentation solutions – and rightfully so.

  4. Obviously the distinction isn’t “cloud” (a word I hate) vs. non-cloud. The distinction is a system that is agile and able to quickly pivot toward the needs of the market vs. a system that was built to solve the healthcare IT needs of 2002. The business model of purchasing a large, entrenched, truly legacy system – a big 6- to 12-month (or longer) implementation, a pile of hardware maintained by a hospital’s local IT staff, and a $100 million price tag you amortize over 10 years to make it seem financially logical, which leaves you running 10-year-old software by the end of the licensing agreement – should be on its way out. It simply does not stand up against the ability to sign up for a hosted (I refuse to use the word cloud) solution that costs less, avoids vendor lock-in, has a shorter implementation, reduces IT fees, and adds needed functionality much, much faster. Someone in the middle of an Epic implementation right now is going to be about 12 years behind the times in 10 years.

  5. Nothing.
    Cloud is irrelevant, and the opposite of “cloud” is not “legacy.” The “cloud” stuff has been around since long before HITECH and didn’t have any more success than the client/server variety. The most innovative (and most popular) EHR for ambulatory practice today is not cloudy at all, although it is often hosted remotely.
    Interoperability outputs are not a problem either. ONC is defining standards, vendors are implementing them, and Meaningful Use Stage 2 will require everybody to take interoperability for a spin. Technology is not the issue here.
    There is currently no business model compelling large providers to exchange information. The upcoming payment reforms may change that, and if they do, there will be plenty of interoperability, cloud or otherwise.

  6. The “Cloud” is the fashionable IT cliché of the decade. “Remote server hosting” had become passé, and a new buzzword was needed for the DigeratiWorld Trendsetters and the rubes they might fleece. Just like “web-based subscription EHR” morphed into the “ASP model” (Application Service Provider), and then on to “SaaS” (Software as a Service), which will soon become __________?

    So, now, purportedly, your ePHI is gonna be out there somewhere “in the Cloud,” interoperably, on-demand accessible.

    (HIPAA sand-in-the-gears notwithstanding)

    The Stage 2 “interoperability” requirement and the whole “transparency” vision are going to require a big-time rethink of the EHR vendors’ business model.

  7. I wonder: is the problem a lack of cloud-based solutions, or a lack of lean young companies using powerful software with out-of-the-box interoperable outputs? How much is actually saved by having the software and storage live in the cloud? While the cloud may be an asset for HIT use among smaller offices and certainly helps with scalability, the bigger problem, it seems to me, is a lack of true interoperability, which has to do with the business model of incumbent vendors trying to keep their products sticky and less of a commodity. You could have a cloud-based product and still lack interoperability, or a client-server product and achieve interoperability.

    What am I missing?