By now most people have heard of the new mobile phone game, Pokémon Go. Pokémon Go uses your phone's GPS data to determine where you are in the physical world and lets Pokémon characters "magically appear" in the area around you, viewed through your phone screen. As you move around, different types of Pokémon appear for you to catch. The idea is to encourage players to travel around their geographic area in order to catch Pokémon. This game provides a glimpse of an approaching next wave of personal wellness and patient engagement applications that will likely bring augmented reality into the mainstream consciousness and imagination.
Augmented reality games provide a twist on geocaching. I have gone on geocaching trips with my kids and generally enjoyed the pleasure of getting eaten alive by mosquitoes while looking under every rock within a quarter mile for a box filled with a couple of dirty action figures. I did this voluntarily, as it was one of the many ways to increase physical activity and keep my kids engaged.
Augmented reality games such as Pokémon Go have demonstrated real innovation in mobile computing and the virtual world. This type of game may prove a better path for the future of computing than fully immersive virtual reality. If augmented reality games can use game mechanics to create interest, a game could just as easily be designed to encourage physical movement to complete tasks. As time progresses we may see a rush to capitalize on augmented reality now that one application has shown how it can be integrated into our daily lives.
First of all, I have to admit that I am a convert, not an original believer, in the Data Lake and late-binding approaches to data analytics. I do not think it is my fault, or at least I have a defense of sorts. I grew up in a world where my entrepreneurial heroes were people like Bill Gates, Larry Ellison, and Steve Jobs, and it seemed that structured systems, like operating systems that allowed many developers to work against a common standard, were the way to go.
In my last blog I riffed on prospect theory and how it applies to health care data sharing. In essence, prospect theory suggests two categories:
1. People are extremely unwilling to accept risk when the consequences are unknown (patients avoid sharing data if they don’t know how it will benefit or harm them)
2. People are more willing to accept the risk when the reward is achievable, and the alternative is very harmful (patients with severe illnesses would readily share data when there is a possibility it could save their life or eliminate significant suffering)
Scenario 1, risk aversion, is the more common across all constituents, including providers, healthy patients and families, political leaders, and philanthropists. Generally, the benefits of sharing health care data are unclear, while the consequences of keeping the data private are not life threatening.
People are more motivated to avoid loss than to seek gains. HIPAA creates a framework that rewards risk-averse behavior around data sharing even when data sharing would ultimately be beneficial to the enterprise, the mission, and the patients. This issue sits at the heart of making progress on data sharing and interoperability in health care. I have some new thoughts on how to bridge this divide.
Recently I read the book ‘Thinking, Fast and Slow’ by the Nobel Prize-winning economist Daniel Kahneman, which discusses the concept of Prospect Theory. In reading through it, I could see a hint of why our industry has so much trouble sharing medical records and, in general, sharing almost anything among trading partners and competitors. If you haven’t read about Prospect Theory, the following tests illustrate the basics of how humans make decisions about risk.
Decision 1: Which do you choose? Get $900 for sure OR 90% chance to get $1,000
Decision 2: Which do you choose? Lose $900 for sure OR 90% chance to lose $1,000[i]
The common answer to #1 is to take the sure $900. The common answer to #2 is to take the 90% gamble to avoid the certain loss. In other words, we take risks to avoid losses but avoid risks when a reward is certain. This behavior is relevant to data sharing and access to PHI and can be instructive about how people will approach risk.
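The asymmetry in those two decisions is easy to verify with a little arithmetic: both gambles have exactly the same expected value as their sure-thing alternatives, so a purely rational actor would be indifferent. The sketch below (illustrative only, not from Kahneman's book; the loss-aversion weight of 2 is a commonly cited rough figure, used here as an assumption) shows how weighting losses more heavily than gains reproduces the typical answers.

```python
# Illustrative sketch of the two prospect-theory decisions above.

def expected_value(outcomes):
    """Expected value of a gamble given (probability, payoff) pairs."""
    return sum(p * x for p, x in outcomes)

# Decision 1: $900 for sure vs. 90% chance of $1,000
risky_gain = expected_value([(0.9, 1000), (0.1, 0)])   # 900.0 -- same as the sure thing

# Decision 2: lose $900 for sure vs. 90% chance of losing $1,000
risky_loss = expected_value([(0.9, -1000), (0.1, 0)])  # -900.0 -- same as the sure loss

# The expected values are identical, yet most people take the sure $900
# in Decision 1 and gamble in Decision 2. A loss-averse value function,
# where losses hurt roughly twice as much as equivalent gains feel good
# (lambda = 2 is an assumed illustrative weight), captures the asymmetry:
LOSS_AVERSION = 2.0

def subjective_value(x):
    """Felt value: gains taken at face value, losses amplified."""
    return x if x >= 0 else LOSS_AVERSION * x

print(subjective_value(900))   # prints 900: the sure gain feels like $900
print(subjective_value(-900))  # prints -1800.0: the sure loss feels like losing $1,800
```

Under this felt-value lens, the certain $900 loss "costs" more than the gamble's expected pain, which is why people gamble to avoid it; the same logic explains why health care organizations treat a certain, visible data-sharing risk as far worse than the diffuse losses of not sharing.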
It’s been an exciting 2016 already in the realm of cloud computing and patient engagement. As I was preparing for the HIMSS16 conference, I was reflecting on how things are moving so quickly with the addition of new technologies and yet some of the core challenges around gathering the information to provide better medicine are still in the dark ages. So here is the question ringing in my head for this year at HIMSS…
How much longer must we wait to finally have a ‘patient cloud’ – a sharable and relatively complete cloud-based health record for each patient?
This is seemingly an obvious prerequisite for providers to deliver better care for patients. The patient-controlled medical record is an old idea that goes back to the Guardian Angel manifesto, published in 1994 at the dawn of the Internet era, and yet 22 years later we still haven’t achieved the first steps toward the fundamental core of a universal, lifelong patient record.
We have been talking about Precision Medicine for a long time now, but so far we are still in the infancy of using genetics to impact medical decision making. The human genome was sequenced in 2003, with the promise of rapid medical advances and genetically tailored treatments. However, development and adoption of these treatments has been slow. Today, with the advent of large cohorts, and in particular the construction of the US Government’s Precision Medicine Cohort, conditions are being set up for precision medicine to flourish. The PMI infographic states three reasons for ‘Why now?’ – sequencing of the human genome, improved technologies for biomedical analysis, and new tools for using large data sets. While I agree this is progress, I believe there are a few other fundamental areas to be tackled in order to really get to the promise of precision medicine.
The concept of protocols is designed for mass production, not mass personalization
Medicine is practiced using protocols and documented in EHRs. When written down, a protocol can look like a cookbook or a choose-your-own-adventure storybook. The intent is to codify medical knowledge into a guide that all physicians can use consistently to obtain the ideal outcome. But strict adherence to medical practice standards inherently favors well-defined paths, based on the assumption that most people respond similarly to treatments. So over time the protocol is refined to suggest which decision is the right one to make with a given patient, and a protocol will more often than not be ‘anti-precision,’ in the same way that a factory is designed to make one size of jeans at a time rather than a custom pair for each customer visiting the store.
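The choose-your-own-adventure nature of a protocol can be made concrete as a fixed decision tree. The sketch below is purely hypothetical (the condition names, flags, and treatments are invented for illustration, not taken from any real guideline): because the tree branches only on the coarse questions the protocol asks, two patients who differ in ways the protocol never checks receive identical recommendations.

```python
# Hypothetical sketch: a care protocol codified as a fixed decision tree.
# All questions, thresholds, and treatment names are invented for illustration.

PROTOCOL = {
    "question": "a1c_over_7",          # coarse flag the protocol branches on
    "yes": {
        "question": "on_metformin",
        "yes": "add_second_agent",     # fixed path for everyone on this branch
        "no": "start_metformin",
    },
    "no": "lifestyle_counseling",
}

def run_protocol(node, patient):
    """Walk the tree using only the coarse flags the protocol asks about."""
    while isinstance(node, dict):
        answer = "yes" if patient.get(node["question"]) else "no"
        node = node[answer]
    return node

# Two patients with different genotypes but the same coarse flags land on
# the same branch -- the "one size of jeans" behavior described above.
p1 = {"a1c_over_7": True, "on_metformin": True, "genotype": "CYP2C9*1"}
p2 = {"a1c_over_7": True, "on_metformin": True, "genotype": "CYP2C9*3"}
print(run_protocol(PROTOCOL, p1))  # prints add_second_agent
print(run_protocol(PROTOCOL, p2))  # prints add_second_agent
```

The genotype field is simply never consulted, so it cannot influence the outcome; mass personalization would require the tree to grow a branch for every clinically meaningful patient difference, which is exactly what protocols designed for mass production avoid.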
I have spent several years working with specialty medical offices like oncology centers, diabetes clinics, IPAs (Independent Practice Associations), and disease advocacy groups seeking to build health care data warehouses and analytics solutions. During that time, I have seen the same concerns pop up over and over: “How can we understand the value and impact of our care if we only see the component of care that we provide? If we can’t understand our value, then how can we make sure that we are optimizing our care, getting reimbursed for our impact, and executing leading research in our specialties that helps find better medical treatments for our patients? How can we really care for patients effectively in the first place?”
Organizations are highly restricted in the ways they share data. HIPAA allows data sharing between entities, but provides no mechanism or incentive to do so efficiently or at scale. Also, the groups that should be sharing may find themselves in competitive situations where sharing could be perceived as risky. But in spite of this, some exciting developments have quietly been moving forward in the past few years that can help fill in pieces of the data last mile.
The rise of Meaningful Use 2 (MU2)-compliant electronic medical records (EMRs), with their objective of enabling health information exchange (HIE) between systems, now represents a potential solution to a challenge that has exacerbated the fragmentation of the health care industry for years. Public HIEs have not yet demonstrated that they can resolve analytics issues or drive workflow changes. Instead, some new and useful models of HIE show great promise, likely adapted from the lessons learned from the original HIE designs.