One of the more interesting guys in health tech is Dale Sanders, who’s been data geek/CIO at multiple provider organizations (Intermountain, Northwestern, Cayman Islands), was in the nuclear weapons program in the US Air Force back in the day, and now is the product visionary at Health Catalyst. Health Catalyst is a very well-backed data warehousing and analytics company that has Kaiser, Partners, Allina and a host of other providers as its customers and investors (and has been a THCB sponsor for a while!). I’ve interviewed CEO Dan Burton a couple of times (here’s 2016) if you want to know more about the nuts and bolts of the company, but this chat with Dale at HIMSS17 got a tad more philosophical about the future of analytics–from “conference room analytics” to “embedded decision support.” I found it great fun and hope you do too!
The term “Big Data” emerged from Silicon Valley in 2003 to describe the unprecedented volume and velocity of data being collected and analyzed by Yahoo, Google, eBay, and others. They had reached an affordability, scalability, and performance ceiling with traditional relational database technology, and the relational database vendors weren’t meeting the need, so a new solution had to be developed.
Through the Apache open source community, Hadoop became that new solution. Since then, Hadoop has become the most powerful and popular technology platform for data analysis in the world. But, healthcare being the information technology culture that it is, Hadoop’s adoption in healthcare operations has been slow.
Date: Wednesday, February 24, 2016
Time: 1:00–2:30 PM ET
In this webinar, Dale Sanders, Executive Vice President of Product Development at Health Catalyst, will explore several questions:
- Why should healthcare leaders and executives care about this technology?
- What has made Hadoop so attractive and rapidly adopted in other industries, but not in healthcare?
- Why is Big Data a bigger deal to those industries than it is to healthcare?
- What do they see that we don’t, and are we missing the IT boat again?
- How is the cloud reducing barriers to adoption by commoditizing the skilled labor required at the local healthcare organization?
This webinar is intended to be valuable to both technical and non-technical audiences, as we explore the convergence of Big Data technology and Healthcare’s Age of Analytics.
The number of mergers, acquisitions, and collaborative partnerships in healthcare continues to skyrocket. That’s not going to change for the next few years unless the FTC decides to be more restrictive. In all of these activities, older generation executives (I can say that because I’m older) have underestimated the importance and difficulty—technical and cultural—of integrating data and data governance in these new organizations, and the difficulties are exponentially more complicated in partnerships and collaboratives that have no formal overarching governance body. In 2014, 100 percent of Pioneer ACOs reported that they had underestimated the challenges of data integration and that the lack of data integration had a major, negative impact on the ACOs’ performance.
Seamless Data Governance
After 33 years of professional observations and being buried up to my neck in this topic, especially the last two years as the topic finally matures in healthcare, I’m convinced that the role model organizations in data governance practice it seamlessly. That is, it’s difficult to point a finger directly at a thing called “Data Governance” in these organizations, because it’s completely ingrained, everywhere. As I’ll state below, it reminds me of the U.S. transition in the early 1980s, when organizations finally realized that product quality was not something you could put in an oversight-driven Quality Department operating as a separate function. Quality must be culturally embedded in every teammate’s DNA. Data governance is the same, especially data quality.
In a detailed letter sent this week to CMS Administrator Marilyn Tavenner and National Coordinator Karen DeSalvo, MD, the American Medical Association presented a long list of ideas to make Meaningful Use better for doctors.
The AMA warned that “unless significant changes are made to the current program and future stages,” doctors will drop out of the meaningful use program, patients will suffer as existing EHRs fail to migrate data for coordinated care, thousands of doctors will incur financial penalties, and new delivery models requiring data will be jeopardized.
All of which is true. But the AMA didn’t go far enough.
Meaningful use is well intentioned, but like a teacher who “teaches to the test,” the program has created a byzantine system that might pass the test of meaningful use stages, but is not producing meaningful results for patients and clinicians.
A study published in the April 2014 issue of JAMA Internal Medicine reveals no correlation between quality of care and meaningful use adherence. This study validates what common sense has told many of us for the last few years.
Meaningful Use Stage 1 was a jump-start for EMR adoption in the industry. That’s a good thing, I suppose, although meaningful use also created a false economic demand for mediocre products. It’s time to put an end to the federal meaningful use program, eliminate the costly administrative overhead of meaningful use, remove the government subsidies that also create perverse incentives, and let “survival of the fittest” play a bigger part in the process.
Let the fruits of EMR utilization go to the organizations that commit, on their own and without government incentives, to maximizing the value of their EMR investments toward quality improvement, cost reduction, and clinical efficiency.