Maybe it is just the shock of being post-Labor Day and realizing that summer is fading into the rearview mirror, or maybe it was something I ate for breakfast that spurred new hope. But I think this is the year that the patient-centric approach to data in life sciences finally takes off. And along with that launch will come a massive, rapid migration to cloud and data lake architectures for pharma data.
Really? Why now, you may ask?
Yeah – that’s right. Every group I have been talking to is worried that they are sitting atop a jigsaw puzzle of siloed data resources that can’t be assembled fast enough to meet the needs of business and scientific users. Organizations believe they can’t answer their questions about why drugs work in some patients and not others if they can’t link phenotype and genotype data. Groups can’t look across clinical trials. They can’t look beyond and between clinical trials and EMR data. Progressive safety groups are considering using automation and cognitive computing to lower the cost of processing events, so that in parallel they can expand detection of new signals into large real-world data sets at ten times current data volumes.
It’s been an exciting 2016 already in the realm of cloud computing and patient engagement. As I was preparing for the HIMSS16 conference, I was reflecting on how things are moving so quickly with the addition of new technologies and yet some of the core challenges around gathering the information to provide better medicine are still in the dark ages. So here is the question ringing in my head for this year at HIMSS…
How much longer must we wait to finally have a ‘patient cloud’ – a sharable and relatively complete cloud-based health record for each patient?
This is seemingly an obvious prerequisite for providers to deliver better care for patients. The patient-controlled medical record is an old idea that goes back to the Guardian Angel manifesto, published in 1994 at the dawn of the Internet era, and yet 22 years later we haven’t achieved even the first steps toward a universal, lifelong patient record.
MU Stage 2 is making everyone miserable. Patients are decrying a lack of access to their records, and providers are upset over late updates and poor system usability. Meanwhile, vendors are dealing with testy clients and the MU certification death march. While this may seem like an odd time to be optimistic about the future of HIT, nevertheless, I am.
The EHR incentive programs have succeeded in driving HIT adoption. In doing so, they have raised expectations of what electronic health record systems should do while bringing to the forefront problems that went largely unnoticed when only early adopters used systems. We now live in a time when EHR systems are expected to share information, patients expect access to their information, and providers expect that electronic systems, like their smartphones, should make life easier.
Moving from today’s EHR landscape to fully interoperable clinical care systems that intimately support clinical work requires solving hard problems in workflow support, interface design, informatics standards, and clinical software architecture. Innovation is ultimately about solving old problems in new ways, and the issues highlighted by the current level of EHR adoption have primed the pump for real innovation. As the saying goes, “Necessity is the mother of invention,” and in the case of HIT, necessity has a few helpers.
In March of 2005, I staffed an interview between Todd Park and Steve Lohr of The New York Times in the cafeteria of the old New York offices of the “Grey Lady.” At the time, Park was heading a very small web-based start-up company that was trying to convince medical groups – and on that day, a leading national technology business reporter – that web-based “cloud” technologies would become mainstream in the healthcare IT industry and were the only logical means to get the hundreds of thousands of independent U.S. doctors and their small offices to go digital.
At the time, Lohr, one of the foremost technology reporters in the country covering IT giants like Microsoft, IBM and Intel, had just started covering Health IT upon the appointment of Dr. David Brailer as the nation’s first National Health Information Coordinator (or, as many called him back then, the “Health Information Czar”). In fact, Lohr had just gotten back from attending the annual HIMSS Conference in Dallas where he met with CEOs of “legacy” healthcare IT behemoths like IDX (now GE), Siemens, Cerner, Allscripts, McKesson and Epic.
In his first article addressing Health IT adoption in the U.S., Lohr touched on what he felt was the core challenge to achieving widespread EHR adoption: getting small medical practices to adopt and actually use these systems – something that had eluded the industry and those legacy IT vendors for many years. On the topic of getting small practices to adopt EHRs and the potential harm to the industry and the Bush Administration’s efforts if they didn’t, Dr. Brailer told Lohr, “The elephant in the living room in what we’re trying to do is the small physician practices. That’s the hardest problem, and it will bring this effort to its knees if we fail.”
Last week President Obama appointed Todd Park as the new Assistant to the President and U.S. Chief Technology Officer, with the responsibility to ensure the adoption of innovative technologies to support the Administration’s priorities including affordable health care. This got me to thinking.
Since taking office, President Obama has made some strong moves to champion the adoption of EHRs through the passing of the HITECH Act. This act, combined with the relaxation of the Stark anti-kickback laws, has enabled a spike in EHR adoption as medical groups work to qualify for Meaningful Use dollars. But it has also had some unintended consequences that Mr. Park may now find himself in a unique position to rectify if he stays true to his support of cloud computing.