Uwe Reinhardt said it perfectly in Tuesday's plenary, though I can only paraphrase his point: “health information is a public good that brings more wealth the more people use it.” Or, as Doc Searls puts it: personal data is worth more the more it is used. Datapalooza is certainly the largest meeting of the year focused on health data, and our Health and Human Services data liberation army was in full regalia. My assessment: so far, so good, but as always, each data liberation maneuver also reveals the next fortified position just ahead. This post will highlight reciprocity as a new challenge to the data economy.
The economic value of health data is immense. Without our data it’s simply impossible to independently measure quality, get independent second opinions, or control family health expenses. The US is wasting $750 billion per year on health care, which boils down to roughly $3,000 per year that each man, woman and child is flushing down the drain.
Data liberation is a battle in the cloud and on the ground. In the cloud, we have waves of data releases from massive federal data arsenals. These are the essential roadmap or graph to guide our health policy decisions. I will say no more about this because I expect Fred Trotter (who is doing an amazing job of leading in this space) will cover the anonymous and statistical aspects of the data economy. Data in the cloud provides the basis for clinical decision support.
You probably saw some of the headlines last week where Box announced that it is supporting HIPAA and HITECH compliance, signing Business Associate Agreements (BAAs), and integrating with several platform app partners such as Doximity, drchrono, TigerText, and Medigram to help seed its new healthcare ecosystem. I also announced that I was formally advising Box on its healthcare strategy.
I was drawn to Box because of all the lessons I learned at Google building a consumer-directed personal health record (PHR), Google Health. Google Health allowed you to securely store, organize and share all of your medical records online and control where your data went and how it was managed. It was unlike the other PHRs in the industry that were tethered to the provider or payor or part of an Electronic Health Record (EHR) system.
Sound good? Well, it was in theory. The big issue with Google Health was aggregating your data from the disparate sources that stored data on you. We had to create a ton of point-to-point integrations with large health insurance companies, academic medical centers, hospitals, medical practices and retail pharmacy chains. All of these providers and payors were covered entities in the world of HIPAA and were required to verify a patient’s identity before releasing any data to them electronically. It was a very bumpy user experience for even the most super-charged, IT-savvy consumer.
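The arithmetic behind that integration pain is worth spelling out. This is a back-of-the-envelope sketch, not anything from Google Health itself, and the counts are invented for illustration: with m PHR platforms and n data holders, bespoke point-to-point feeds multiply, while a shared data standard would let each party build a single adapter.

```python
# Illustrative integration arithmetic (hypothetical numbers, not from the post).
def point_to_point(m, n):
    """One bespoke integration per (PHR platform, data source) pair."""
    return m * n

def via_standard(m, n):
    """Each party writes one adapter to a shared interchange format."""
    return m + n

# Even a modest ecosystem blows up quickly:
print(point_to_point(5, 200))  # 1000 custom integrations
print(via_standard(5, 200))    # 205 adapters
```

The gap between m·n and m+n is the economic case for standardized health data exchange rather than the point-to-point approach described above.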
There aren’t many who would quibble with an argument that those with severe mental illness—specifically, individuals “who have been involuntarily committed to a mental institution, found incompetent to stand trial or not guilty by reason of insanity, or otherwise have been [legally judged] to have a severe mental condition that results in the individuals presenting a danger to themselves or others”—should not be able to purchase firearms. Right? Right.
Making that law isn’t actually the trouble (expanding background checks is, of course, a different story). It’s already law, and has been on the books for a while. The trouble is enforcing it.
The federal government maintains the National Instant Criminal Background Check System (NICS), a database of people who are federally prohibited from purchasing guns, including felons, people convicted of domestic violence, and individuals who meet the extreme mental illness criteria above. Except:
Federal law does not require State agencies to report to the NICS the identities of individuals who are prohibited by Federal law from purchasing firearms, and not all states report complete information to the NICS.
To recap: We have federal criteria that prohibit certain individuals from buying firearms. The feds maintain a database of known individuals for background checks (which take 30 seconds, per the regulation). But states aren’t required to offer the names of “prohibitors” to the database.
I am affiliated with the institution where Dzhokhar Tsarnaev is currently hospitalized. I am friends with people who have treated him. I’m trying to stay away from those people; I would be unable to help asking them about him. They might be unable to help talking about him. There has been a flurry of emails and red-letter warnings cautioning people here not to talk about Mr. Tsarnaev or look him up on the EMR (Electronic Medical Record) system. Despite this there have been leaks of information and photos from various sources. It is virtually impossible to keep people from asking about him and talking about him. Curiosity is human nature. When human nature comes up against morals and laws, human nature will win a good percentage of the time. The question is: given what he has done, does this 19-year-old still have his right to privacy?
The answer, of course, is yes. The American Medical Association includes patient confidentiality in its ethical guidelines:
“…the purpose of a physician’s ethical duty to maintain patient confidentiality is to allow the patient to feel free to make a full and frank disclosure of information…with the knowledge that the physician will protect the confidential nature of the information disclosed.”
There are legal guidelines as well, most notably the Health Insurance Portability and Accountability Act, or HIPAA. This law was originally passed in 1996 to improve the efficiency and effectiveness of the health care system, allow people to switch jobs without losing their health insurance, and impose some rules on electronic medical information. Congress incorporated into HIPAA provisions that mandate the adoption of Federal privacy protections for health information. The “simplified” administrative document for the privacy and security portions of HIPAA is 80 pages long. Basically, your health information cannot be shared with ANYONE. Of course, there are exceptions to HIPAA.
I’m sure you get a lot of hate mail, especially from folks in my profession, so when you got this letter from me you probably assumed it was more of the same. Let me reassure you: I am not one of those docs. I do think patient privacy is important, and actually found you quite useful when facing unwanted probing questions from family members. I believe the only way for patients to really open up to docs like me is to have a culture of respect for privacy, and you are a large part of that trust I can enjoy. Yeah, there was trust before you were around, but that was before the internet, and before people used words like “social media,” and “data mining.”
But there have been things done in your name, things I’ve recently come into contact with, that make me conclude that either A: you are very much misunderstood, or B: you have a really dark side.
The latest news story to examine the issue of patient access to implantable cardiac defibrillator data (a variation on the theme of “gimme my damn data”) is an in-depth, Page One Wall Street Journal story featuring Society for Participatory Medicine members Amanda Hubbard and Hugo Campos. They have garnered attention in the past – one example is another piece on Hugo on the NPR Shots blog about six months back. The question posed by these individuals is simple — May I have access to the data collected and/or generated by the medical device implanted in my body? — but the responses to the question have been anything but. It is important to note that not every patient in Amanda’s or Hugo’s shoes would want the data in as detailed a format as they are seeking to obtain, and we should not impose the values of a data-hungry Quantified Self devotee on every similarly-situated patient. Different strokes for different folks.
The point is that if a patient wants access to this data he or she should be able to get it. What can a patient do with this data? For one thing: correlate activities with effects (one example given by Hugo is his correlation of having a drink of scotch with the onset of an arrhythmia — correlated through manual recordkeeping — which led him to give up scotch) and thereby have the ability to manage one’s condition more proactively.
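The kind of manual correlation Hugo did can be sketched in a few lines of code. The dates and counts below are invented for illustration, and this is only a toy version of his recordkeeping: log the days a suspected trigger occurred, log the days an episode occurred, and count how often an episode follows a trigger within a day.

```python
from datetime import date, timedelta

# Hypothetical logs in the spirit of Hugo's manual recordkeeping.
# All dates here are invented for illustration.
drink_days = {date(2012, 5, 1), date(2012, 5, 4), date(2012, 5, 11)}
episode_days = {date(2012, 5, 2), date(2012, 5, 5), date(2012, 5, 20)}

def follows_within(events, triggers, days=1):
    """Count events that occur within `days` after any logged trigger."""
    return sum(
        any(timedelta(0) <= (e - t) <= timedelta(days=days) for t in triggers)
        for e in events
    )

linked = follows_within(episode_days, drink_days)
print(f"{linked} of {len(episode_days)} episodes within a day of a drink")
```

A correlation like this isn't proof of causation, of course, but it is exactly the kind of signal that can prompt a patient to change a habit and then watch whether the episodes stop.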
A time-and-technology-challenged FDA, the proliferation of software-controlled medical devices in and outside of hospitals, and the growth of hacking have resulted in medical technology that’s riddled with malware. Furthermore, the lack of security built into the devices makes them ripe for hacking and malfeasance.
Scenario: a famous figure (say, a politician with an implantable defibrillator or a young rock star with an insulin pump) is targeted by a hacker, who industriously works his way into the ICD’s software and delivers a shock so strong it’s akin to electrocution.
Got the picture?
Welcome to the dark side of health IT and connected health. Without strong and consistently adopted security technology and policies, this scenario isn’t a wild card: it’s in the realm of possibility. This is not new news: back in 2008, a research team showed that a common pacemaker-defibrillator could be wirelessly reprogrammed, and security researcher Barnaby Jack later demonstrated that such a device could be made to deliver a “deadly 830-volt jolt.”
Who owns a piece of health information:

·The patient to whom it refers?
·The health provider that created it?
·The IT specialist who has the greatest control over it?
The notion of ownership is inadequate for health information. For instance, no one has an absolute right to destroy health information. But we all understand what it means to own an automobile: You can drive the car you own into a tree or into the ocean if you want to. No one has the legal right to do things like that to a “master copy” of health information.
All of the groups above have a complex series of rights and responsibilities relating to health information that should never be trivialized into ownership.
Over the past decade, I’ve seen a number of studies asking people whom they trust among various health care stakeholders. Nurses, pharmacists, and doctors always come out at the top. Beyond that:
·Trust of hospitals tends to be high (60–80%)
·Trust of health plans is at the bottom of the heap (10–20%)
Is this written in stone for the future? I don’t think so…and the dynamics for change are in motion. Please read on.
Here’s the emerging picture I’m seeing:
·Hospitals are dragging their feet in connecting you with your electronic health information.
·Health plans are highly motivated to connect you with your health information.
Hospitals Keeping You from Your Health Records
Yesterday the American Hospital Association released a 68-page letter commenting on proposed regs for Meaningful Use Stage 2. Putting aside my usual analytic tendencies, I’ll simply describe the letter as whiny, snivelly, “can’t do”, mean, and thick-headed.