Two years ago we wouldn’t have believed it — the U.S. Congress is considering broad privacy and data protection legislation in 2019. There is some bipartisan support and a strong possibility that legislation will be passed. Two recent articles in The Washington Post and AP News will help you get up to speed.
Federal privacy legislation would have a huge impact on all healthcare stakeholders, including patients. Here’s an overview of the ground we’ll cover in this post:
Six Key Issues for Healthcare
We are aware of at least 5 proposed Congressional bills and 16 Privacy Frameworks/Principles. These are listed in the Appendix below; please feel free to update these lists in your comments. In this post we’ll focus on providing background and describing issues. In a future post we will compare and contrast specific legislative proposals.
Big news from perennial Health 2.0 favorite Healthline. The fast-growing consumer destination site is spinning off a new subsidiary aimed at the provider-side analytics market. Here’s a quick announcement of the news from CEO Dean Stephens.
I’ve been thinking a lot about “big data” and how it is going to affect the practice of medicine. It’s not really my area of expertise, but here are a few thoughts on the tricky intersection of data mining and medicine.
First, some background: these days it’s rare to find companies that don’t use data-mining and predictive models to make business decisions. For example, financial firms regularly use analytic models to figure out if an applicant for credit will default; health insurance firms can predict downstream medical utilization based on historic healthcare visits; and the IRS can spot tax fraud by looking for fraudulent patterns in tax returns. The predictive analytic vendors are seeing an explosion of growth: Forbes recently noted that big data hardware/software and services will grow at a compound annual growth rate of 30% through 2018.
Big data isn’t rocket surgery. The key to each of these models is pattern recognition: correlating a particular variable with another and linking variables to a future result. More and better data typically leads to better predictions.
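To make the pattern-recognition point concrete, here is a minimal sketch in the spirit of the insurer example above: fit a one-variable linear model linking last year’s visit count to this year’s utilization. The data and the dollar figures are entirely made up for illustration.

```python
# A toy version of "correlating a variable with a future result":
# ordinary least squares on one predictor, in plain Python.

def fit_line(xs, ys):
    """Fit y = a + b*x by ordinary least squares."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical history: (visits last year, cost this year in $1000s).
visits = [1, 2, 3, 5, 8]
costs = [2.0, 2.9, 4.1, 6.0, 9.1]

a, b = fit_line(visits, costs)
# Predict downstream cost for a member who had 4 visits last year.
predicted = a + b * 4
```

Real actuarial models use hundreds of variables and far more careful statistics, but the mechanics — learn a link from historical pairs, apply it to a new case — are exactly this.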
It seems that the unstated, implicit belief in the world of big data is that when you add more variables and get deeper into the weeds, interpretation improves and the predictions become more accurate.
The health care field is in the grip of a standard that drains resources while giving little back in return. Stuck in a paradigm that was defined in 1893 and never revised with regard for the promise offered by modern information processing, ICD symbolizes many of the fetters that keep the health industries from acting more intelligently and efficiently.
We are not going to escape the morass of ICD any time soon. As the “I” in the name indicates, the standard is an international one, and its pace of change is too slow to be clocked.
In a period when hospitals are gasping to keep their heads above water and need to invest in such improvements as analytics and standardized data exchange, the government has weighed them down with costs reaching hundreds of thousands of dollars, even millions, just to upgrade from version 9 to version 10 of ICD. An absurd appeal to Congress pushed the deadline back another year, penalizing the many institutions that had faithfully made the investment. But the problems of ICD will not be fixed by version 10, nor by version 11 — they stem from the committee’s disregard for the information needs of health institutions.
Disease is a multi-faceted and somewhat subjective topic. Among the aspects that health care providers must consider are these:
Disease may take years to pin down. At each visit, a person may be entering the doctor’s office with multiple competing diagnoses. Furthermore, each encounter may shift the balance of probability toward some diagnoses and away from others.
Disease evolves, sometimes in predictable ways. For instance, Parkinson’s and multiple sclerosis lead to various motor and speech problems that change over the decades.
Diseases are interrelated. For instance, obesity may be a factor in such different complaints as Type 2 diabetes and knee pain.
All these things have subtle impacts on treatment and–in the pay-for-value systems we are trying to institute in health care–should affect reimbursements. For instance, if we could run a program that tracked the shifting and coalescing interpretations that eventually lead to a patient’s definitive diagnosis, we might make the process take place much faster for future patients. But all a doctor can do currently is list conditions in a form such as:
E66.0 – Obesity due to excess calories
E11 – Type 2 diabetes mellitus
M25.562 – Pain in left knee
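The tracking program imagined above can be sketched quite simply. Everything below is my own invention, not a feature of ICD or any real system: each encounter multiplies in a likelihood for the newly observed finding and renormalizes, so the balance of probability shifts toward some diagnoses and away from others, just as described. The diagnoses and numbers are illustrative.

```python
# A Bayesian-style sketch of tracking competing diagnoses across visits.

def update(priors, likelihoods):
    """One update step: priors and likelihoods are dicts keyed by diagnosis."""
    posterior = {d: priors[d] * likelihoods.get(d, 1e-6) for d in priors}
    total = sum(posterior.values())
    return {d: p / total for d, p in posterior.items()}

# Visit 1: three hypothetical competing diagnoses, roughly balanced.
beliefs = {"type_2_diabetes": 0.4, "thyroid_disorder": 0.3, "other": 0.3}

# Visit 2: elevated fasting glucose, far more likely under diabetes,
# shifts the balance of probability accordingly.
beliefs = update(beliefs, {"type_2_diabetes": 0.8,
                           "thyroid_disorder": 0.1,
                           "other": 0.2})
```

Run over many patients’ histories, the trajectories of these shifting beliefs are exactly the kind of data that could shorten the diagnostic odyssey for future patients — and exactly what a flat list of final codes throws away.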
The tragedy is that today’s data analytics allow so much more sophistication in representing the ins and outs of disease. Take the issue of interrelations, for instance.
The field of analytics has fallen into a few big holes lately that represent both its promise and its peril. These holes pertain to privacy, policy, and predictions.
Policy (2.2 million enrolled of 7 million projected). The biggest analytics project in recent history is the $6 billion federal investment in the health exchanges. The goals of the health exchanges are to enroll people in the health insurance plans of their choice, determine insurance subsidies for individuals, and inform insurance companies so that they could issue policies and bills.
The project touches on all the requisites of analytics including big data collection, multiple sources, integration, embedded algorithms, real time reporting, and state of the art software and hardware. As everyone knows, the implementation was a terrible failure.
The CBO’s conservative estimate was that 7 million individuals would enroll in the exchanges. Only 2.2 million did so by the end of 2013. (This does not include Medicaid enrollment which had its own projections.) The big federal vendor, CGI, is being blamed for the mess.
Note that CGI was also the vendor for the Commonwealth of Massachusetts which had the worst performance of all states in meeting enrollment numbers despite its long head start as the Romney reform state and its groundbreaking exchange called the Connector. New analytics vendors, including Accenture and Optum, have been brought in for the rescue.
Was it really a result of bad software, hardware, and coding? Was it that the design to enroll and determine subsidies had “complexity built-in” because of the legislation that cobbled together existing cumbersome systems, e.g. private health insurance systems? Was it because of the incessant politics of repeal that distracted policy implementation? Yes, all of the above.
The big “hole”, in my view, was the lack of communications between the policy makers (the business) and the technology people. The technologists complained that the business could not make decisions and provide clear guidance. The business expected the technology companies to know all about the complicated analytics and get the job done, on time.
The ensuing rift, in which neither group knew how to talk to the other, is recognized as a critical failure point. In fact, those who are stepping into the rescue role have emphasized that there will be management status checks daily “at 9 AM and 5 PM” to bring people together, know the plan, manage the project, stay focused, and solve problems.
Walking around this hole will require a better understanding of why the business and the technology folks do not communicate well, and a recognition that soft people skills can avert hard technical catastrophes.
HIMSS has opened and closed in Florida and I’m in Boston with snow up to my rectus abdominis. After several years of watching keynote pageants and scarfing up the amenities at HIMSS conferences, I decided to stay home this year.
In general, I’ve found that my attendance at HIMSS leads to moaning and carping about the state of health IT. So this year I figured I could sit in my office while moaning and carping about the state of health IT.
In particular, my theme this year is how health IT is outrunning the institutions that need it, and what will happen to those left behind.
The scissors crisis: more IT expenditures and decreasing revenues
Although the trade and mainstream press discuss various funding challenges faced by hospitals and other health providers, I haven’t seen anyone put it all together and lay out the dismal prospects these institutions have for fiscal health. Essentially, everything they need to do in information technology will require a lot more money, and all the income trends are declining.
Certainly the long-term payoff for the investment in information technology could be cost reductions — but only after many years, and only if it’s done right. And certainly, some institutions are flush with cash and are even buying up others. What we’re seeing in health care is a microcosm of the income gap seen throughout the world. To quote Billie Holiday: them that’s got shall get; them that’s not shall lose.
Here are the trends in IT:
Meaningful Use requires the purchase of electronic health records, which run into the hundreds of thousands of dollars just for licensing fees. Training, maintenance, storage, security, and other costs add even more. The incentive payments from the federal government come nowhere near covering the costs. EHR providers who offer their record systems on the Web (Software as a Service) tend to be cheaper than the older wave of EHRs. Open source solutions also cost much less than proprietary ones, but have made little headway in the US.
Hey there, maybe THCB readers can weigh in on this one. I work at a healthcare startup. Somebody I know who works in medical billing told me that several big-name insurers they know of are using analytics to adjust reimbursement rates for medical billing codes on an almost daily and even hourly basis (a bit like the travel sites and airlines do to adjust for supply and demand) and to encourage/discourage certain codes. If that’s true, it’s certainly fascinating and, I guess, pretty predictable.
I’m not sure how I feel about this. It sounds draconian. On the other hand, it also sounds cool. Everybody else is doing the same sort of stuff with analytics: why not insurers? Information on this practice would obviously be useful for providers submitting claims, who might theoretically be able to game the system by timing when and how they submit. Is there any data out there on this?
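For what it’s worth, here’s a purely speculative sketch of how the rumored mechanism might work: nudge a code’s rate down as recent submission volume rises, the way airlines reprice seats against demand. Every name and number here is hypothetical; nothing reflects any actual insurer’s system.

```python
# Speculative demand-based rate adjustment for a billing code.

def adjust_rate(base_rate, recent_volume, expected_volume, sensitivity=0.1):
    """Scale the rate by how far submission volume deviates from expectation."""
    deviation = (recent_volume - expected_volume) / expected_volume
    # Cap the swing so a volume spike can't zero out (or double) the rate.
    swing = max(-0.5, min(0.5, sensitivity * deviation))
    return round(base_rate * (1 - swing), 2)

# A surge in submissions of some hypothetical code pushes the rate down...
lower = adjust_rate(100.0, recent_volume=3000, expected_volume=1000)   # 80.0
# ...while a lull pushes it back up.
higher = adjust_rate(100.0, recent_volume=500, expected_volume=1000)   # 105.0
```

If insurers really run something like this, a provider watching their own remittances could in principle reverse-engineer the curve — which is exactly the gaming-by-timing worry raised above.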
Is this b.s. or not?
Lost in the health care maze? Having trouble with your health Insurance? Confused about your treatment options? Email your questions to THCB’s editors. We’ll run the good ones as posts.
In an article posted earlier this year on this blog I argued that hospitals have traditionally done a sub-par job of leveraging what has now been dubbed “big data.” Effectively mining and managing the ever rising oceans of data presents both a major challenge – and a significant opportunity – for hospitals.
By doing a better job of connecting the dots of their big data assets, hospital management teams can start to develop the crucial insights that enable them to make the right and timely decisions that are vital to success today. And better, timelier decisions lead to improved results and a higher level of quality patient care.
That’s the good news. The less than positive story is that hospitals are still way behind in using the mountains of data that are being generated within their institutions every day. Nowhere is this more apparent than in the advanced data management practice of predictive modeling.
At its most basic, predictive modeling is the process by which data models are created and used to try to predict the probability of an outcome. The exciting promise of predictive modeling is that it gives hospitals the ability, in effect, to see into (and predict) the future. Given the massive changes and continuing uncertainty that are buffeting all sectors of the healthcare industry (and especially healthcare providers), having a clearer future view represents an important strategic advantage for any hospital leader.
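That most basic form can be shown in a few lines: estimate the probability of an outcome, say 30-day readmission, from historical frequencies within a risk group, then apply it to new patients. The risk groups and records below are invented for the sketch; real hospital models are far richer, but the create-then-predict loop is the same.

```python
# The simplest possible predictive model: outcome probability by group.
from collections import defaultdict

def train(records):
    """records: list of (risk_group, readmitted) pairs -> per-group probability."""
    counts = defaultdict(lambda: [0, 0])  # group -> [readmissions, total]
    for group, readmitted in records:
        counts[group][0] += int(readmitted)
        counts[group][1] += 1
    return {g: r / n for g, (r, n) in counts.items()}

# Hypothetical discharge history.
history = [("high", True), ("high", True), ("high", False),
           ("low", False), ("low", False), ("low", True), ("low", False)]

model = train(history)
p_high = model["high"]  # estimated readmission probability, high-risk group
```

A hospital that knows a group’s readmission probability before discharge can target follow-up calls and home visits where they matter — the “clearer future view” in miniature.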
In a piece just posted at TheAtlantic.com, I discuss what I see as the next great quest in applied science: the assembly of a unified health database, a “big data” project that would collect in one searchable repository all the parameters that measure or could conceivably reflect human well-being.
I don’t expect the insights gained from these data to make physicians obsolete, but rather to empower them (as well as patients and other stakeholders) and make them better, informing their clinical judgment without supplanting their empathy.
I also discuss how many companies and academic researchers are focusing their efforts on defined subsets of the information challenge, generally at the intersection of data domains. I observe that one notable exception seems to be big pharma, as many large drug companies seem to have decided that hefty big data analytics is a service to be outsourced, rather than a core competency to be built. I then ask whether this is savvy judgment or a profound miscalculation, and suggest that if you were going to create the health solutions provider of the future, arguably your first move would be to recruit a cutting-edge analytics team.
The question of core competencies is more than just semantics – it is perhaps the most important strategic question facing biopharma companies as they peer into a frightening and uncertain future.