Healthcare’s Privacy Problem (Hint: It’s Not What You Think It Is)

I recently applied for life insurance. The broker, whom I’ve never met, asked about my health history. “So you’ve just had a baby,” he began. I asked him how he knew. “You’re on Twitter.”

In the last couple of years, concerns about the privacy of online health information have grown, as health care finally catches up to other sectors in its use of information technology (IT). The Stimulus package will pump $19.2 billion into healthcare IT, especially electronic medical records for doctors.

While technology can make your medical records safer in some ways than they’d be in a paper chart (using encryption, firewalls, audit trails, etc.), the fact is, no system is totally fail-safe. And when screw-ups happen, technology tends to super-size them.

A few advocates say the main fix is for people to have as much choice (or “consent”) as possible about sharing particular tidbits of health info. Not a bad place to start, but relying too much on consent is impractical and burdensome. We also need limits on who can use the info in your record, and for what. The use of these and other widely accepted “fair information practices” will go a long way when it comes to safeguarding the medical record your doctor holds.

But wait. Your virtual health record is so much bigger than that. It’s the iceberg beyond the traditional medical record tip. Part of it is the verbal trail you create on Facebook, on Twitter (evidently!), in an online patient community, via web searches, or on e-mail. But it’s also what you do–what you buy at the grocery store, how fast you drive, maybe even who you talk to on the phone. Right now, most of that information isn’t readily available to the public, and it isn’t linked, but more of it will be. There are plenty of incentives for companies to understand and influence the minutiae of your daily life.

The iceberg of health information about you is growing. Recently, researchers concluded that Agatha Christie likely suffered from Alzheimer’s simply by analyzing the vocabulary in her novels.

As more and more data about each of us is generated, including through tiny sensors that will increasingly be used in clothing and other products, there is more information to glean from it—about our physical health, actions, and even mental health. The MIT Media Lab is working on computer programs that can “read” head movements and facial expressions to understand emotions.

Eventually, the traditional medical record may pale relative to the vast stores of information about your health that can be found in nontraditional ways. So when we think about health privacy, we need to recognize that safeguarding the traditional medical record is only the start. The best policy approaches also protect against discrimination and its consequences. So despite the banners and screams of “Socialist State” in my neighborhood in Washington, DC (oops, there’s more personal information!), the Health Reform Bill, if implemented well, is a strong and necessary step toward protecting individuals against an unavoidable erosion of their health privacy.

Lygeia Ricciardi is the founder of Clear Voice Consulting (www.clear-voice.com) and part of the leadership team of Clinovations (www.clinovations.com). She specializes in strategy, policy, and implementation of health IT, with a passionate focus on the consumer. And yes, she is on Twitter: @Lygeia

16 Comments on "Healthcare’s Privacy Problem (Hint: It’s Not What You Think It Is)"


Guest
Mar 25, 2010

Very well stated! There is some of this type of policy in existence now (e.g., in GINA). But I don’t know how the Health Reform Bill provides any of these types of protections. Any pointers?

Guest
Mar 25, 2010

There’s also the MIB, the Medical Information Bureau, which has been around for years and maintains an exchange of files among insurance carriers sharing information.
http://ducknetweb.blogspot.com/2008/08/what-is-mib-medical-insurance-bureau.html

Guest
Michael, T
Mar 25, 2010

There is a second “privacy” concern around the use of our medical data that is often overlooked: its use to influence your provider’s behavior.
Many people don’t realize that drug benefit companies harvest 95% of all prescriptions written and then combine that with the AMA database to target-market to providers without breaking any privacy laws. Drug reps often know more about your provider’s prescribing practices than you might expect.
There are also now EHRs like Practice Fusion that deliver ads directly to your provider’s EHR (you get a “free” EHR in exchange), and other firms that use the check-in process to capture patients’ information and deliver ads directly to patients in the waiting room. None of this is based on standard of care but simply on who purchases the ads.
Just as an aside, some firms are also now using your Rx history as a surrogate for your medical risk when you apply for a mortgage or loan, i.e., if you have certain high-risk or chronic conditions, you may be required to pay higher interest rates on loans.

Guest
Mar 25, 2010

Biometric devices also contribute to the medication data that is for sale. More details and additional links are here, including entities not covered by HIPAA:
http://ducknetweb.blogspot.com/2010/03/colbert-report-takes-on-vitality.html

Guest
Mar 25, 2010

First off, Congratulations!
Second, I’m curious which parts of the bill you think will help ensure privacy. Is it just the non-discrimination provisions for pre-existing conditions, or is there more than that?
Third, you bring up points about other areas, including employment. I frequently see advice saying not to share “controversial” thoughts online (presumably this would include the words I’m typing) for fear that employers could use them to actively discriminate against job-seekers. The assumption seems to be that employers are actively discriminating based on political affiliation, religion, etc., using social media. The laws exist, but currently there’s no realistic way to enforce them. Do you have any suggestions on what a solution to enforcement might look like?
Finally, great post! I’ve often looked at the problem of health data from the access perspective rather than the use perspective (see my post last Sunday). There are insights that could be gleaned if all clinical data were on a standardized system. It’s a dream at present, but focusing on use could hold the key to more data sharing on a trusted platform, available to use for the right (non-discriminatory) reasons.

Guest
Mar 25, 2010

Hi Lygeia
I’d not thought about this aspect of healthcare privacy, but now that you’ve pointed it out, I’m pondering the implications for those people who religiously update their Facebook with news of late nights and crushing hangovers!
One day they may wish they’d kept digitally quiet about such matters; maybe I’ll wish I’d ‘de-friended’ them to avoid being found guilty by association.
So many people worry about what other people write about them, but the real problem could be what they’re writing about themselves…
Thank you for a fascinating post.

Guest
MG
Mar 25, 2010

I find it fascinating that Americans seem so concerned about this privacy issue yet are willing to allow much greater intrusion into and monitoring of their private lives for ‘national security’ purposes, including the high likelihood that most of their electronic communications (email, cell phone, bank transactions) are already passively monitored to some degree.

Guest
Mar 25, 2010

Lygeia,
Very insightful: “the traditional medical record may pale relative to the vast stores of information about your health that can be found in nontraditional ways.”
So, now what?
Does the “healthcare privacy/security” issue need to be simply redefined in a much broader context of “personal” privacy security?
Do more traditional concerns about healthcare security/privacy become moot?
What can Joe the Plumber do after he recognizes the problems you point out? Should we all become hermits?
Do Google et al. make George Orwell’s 1984 look passé?
Thanks for pointing out and opening the real can of worms :)

Guest
SR
Mar 25, 2010

Rape Victims’ Choice – Risk AIDS or Insurance in the Future
One of the most difficult privacy cases often happens to women who have been raped. If they take the recommended prophylactic AIDS drugs to prevent HIV, they will essentially make themselves uninsurable in the future.
http://huffpostfund.org/stories/2009/10/rape-victims-choice-risk-aids-or-health-insurance

Guest
Mar 25, 2010

Having life insurance is now a priority, but we must ensure that the payments are not astronomical as we implement a system in which insurance coverage is available to everyone at an appropriate cost; such a system will reportedly be implemented soon for all citizens who are still uninsured.

Guest
Mar 25, 2010

Thanks very much for the insightful comments and responses!
Regarding Health Reform, my thought was not that it protects privacy per se, but that by providing access to health insurance coverage to a greater proportion of Americans, it minimizes the impact of discrimination in one of the areas in which it tends to hit hardest.
I wish I knew “the solution” to the challenge of the erosion of privacy. For most of us it isn’t climbing into our shells to avoid leaving any trail that might come back to haunt us, which is practically impossible anyway (though you should think carefully about those party pictures on Facebook).
I believe that law and policy are part of the answer. I agree with Dave that GINA is one of the best examples of anti-discrimination law. I would like to see more laws in that model—though I realize that enforcement is tricky at best.
In addition, although greater transparency is part of the problem, it can also be part of the solution. Even now, most companies that handle sensitive health data are motivated to avoid breaches because they don’t want to lose customer trust. That’s a lot scarier to them than the (often remote) possibility of HIPAA enforcement. Transparency can help companies—and individuals—to act more decently than we might otherwise.
I am with Vince that this brave new world has certain Orwellian characteristics. Though I don’t believe there is a simple answer, I want to make sure that we don’t get so obsessed arguing over particular trees in the policy discussion that we miss the forest altogether.

Guest
Mar 25, 2010

While I believe that anyone who thinks that his/her electronically stored information is 100% private is a bit delusional, I think you’re conflating several different privacy discussions.
1. Lifestreaming Millennials. These kids (and their millennial-wannabe elders) post too much stuff about everything – partying too hard is just the tip of the iceberg. They just need to be more circumspect, because it will all come back to bite them – in a job interview (in some cases, they’ll never get that interview) or otherwise. We consent to websites’ terms of service, which include (in some cases) descriptions of the “walled communities” within which posts are visible, but there are no enforceable promises to keep anything private.
2. Sensors. Whether it’s the Nike chip in our running shoes that connects us to a community of runners (kinda like #1, because it’s volitional) or the video images culled from cameras in public places and analyzed (why would they bother, for most of us?) by the MIT Media Lab or others (a little creepier, and less volitional), this is all information that we essentially choose to share by engaging with the sensors that are around us (vs. moving to a cabin in the great north woods). Once we put the chip in our shoes, or walk around in public where we all know all sorts of cameras and sensors are picking up some sorts of data about us, we’ve consented to the gathering and sharing of that data.
3. EHRs and PHRs. PHRs are sort of a special-purpose Facebook with very limited numbers of friends. Again, we use these tools because we get some value out of them, and (putting my pointy-headed lawyer’s hat on for a moment) assume the risk of the records becoming more public than intended. EHRs are the tools used by our health care providers, so (in the future) unless we get our care from the local shaman in the great north woods, our health records will be online, in (or accessible through) the cloud. I always say that since I’m not Britney Spears, I don’t think anyone is going to be very interested in my health records. Making data available through these tools is more helpful (both to me and to the extent it can be aggregated into population-based studies and development of recommended standards of care) than harmful.
My point is that there is a continuum of choice that underlies the discussion here — some data sharing we choose, some we have no say over. The key, from my perspective, is (a) better-informed, more thoughtful personal decisionmaking about sharing of information and (b) the enactment — and, more importantly, the enforcement — of antidiscrimination laws such as GINA.

Guest
john
Mar 25, 2010

I agree that privacy of any personal data should be encouraged through any and all legislative and legal means, but I also believe that many who trumpet privacy issues for whatever reason overdo the issue substantially.
I write this primarily because, for all intents and purposes, personal data have always been relatively easy to obtain, even without the convenience of fast electronic retrieval enabled by the ever-greater digitization of data and the Internet. Those data are rarely, if ever, accessed, for the simple reason that by and large no one cares about having or knowing them.
Concerns are raised, for example, that a potential employer might become aware of some medical matter for a potential employee. My reaction is: so what?
If an employee at that company first makes an effort to obtain the data and second acts on the data in some negative way, i.e., uses the data as a basis for excluding the job seeker from employment, why would the job seeker want employment at that company to begin with?
This applies to all potential destinations where private data might leak. Negative consequences either do not exist at all or are irrelevant.
