
Are We Adequately Securing Personal Health Information?

In a discussion about electronic health records (EHRs) a couple of weeks ago, one of the Human Resources team members at a prospective client said, “I don’t believe it’s possible to secure electronic health data. It’s always an accident waiting to happen.”

There is some truth to that. More and more, our Personal Health Information (PHI) is in electronic formats that allow it to be exchanged with professionals and organizations throughout the health care continuum. It is highly unlikely that each contact point has the protections to wrap that data up tightly, away from those who would exploit it.

Of course, PHI is among the richest examples of personal data, often with all the key ingredients prized by identity thieves: Social Security number, birth date, phone numbers, address, and even credit card information. This should give health care organizations considerable pause.

Then consider that, while paper charts contain the same information, electronic files often aggregate hundreds of thousands or even millions of records: information treasure troves for someone focused on acquiring, mining, and making use of the data.

Which is what makes a new health data security survey, commissioned by Kroll Fraud Solutions and conducted by HIMSS Analytics, so provocative. As it had in 2008, HIMSS Analytics found that most provider organizations meticulously comply with data security rules and standards. But they’re overly confident about the security that compliance actually conveys. Worse, many remain unaware, until confronted by an event, of the devastating implications of even a minor breach.

And the threat is intensifying as the market and technology evolve. In 2010, 19 percent of organizations reported a breach, nearly half-again higher than the 13 percent reported in 2008. Apparently, both the complexity of the environment and the interest in the data are growing. Security may be diminishing as a result.

And breaches can be hugely costly. A Ponemon Institute study found an average cost of $6.75 million per organizational data breach. This figure is not limited to incidents with malicious origins or even harmful consequences. In January 2009, the Department of Veterans Affairs agreed to pay $20 million to veterans who could show they were harmed when, in 2006, a VA data analyst lost a laptop containing information on 26.5 million patients, nearly every living veteran. The laptop was eventually recovered without apparent data compromise. The VA is now struggling with a new, serious health data breach.

Nor is the impact likely to be financial alone. The larger cost may simply be in the loss of patient confidence. After all, if an organization can’t competently manage my data, do I want to hand over management of my family’s health?

Perhaps the HIMSS Analytics study’s most important and penetrating finding is that “health care organizations continue to think of data security in specific silos (IT, employees, etc.) and not as an organization-wide responsibility, which creates unwanted gaps in policies and procedures.” Nearly 9 in 10 survey respondents said they have policies in place to monitor access to and sharing of health care information. Yet more than four-fifths of breaches occur in more mundane ways: lost or stolen laptops, improper document disposal, stolen tapes. In other words, the holes can’t be addressed by isolated approaches.

Security is a process, not a product. This means that certification of PHI security must be larger than merely plugging the security gaps in information technology, and must extend to the ways that people access and use information and the information technology.

It is clear that the answers here involve making health data security an enterprise-wide responsibility, creating highly aware environments resistant to breach in even the most seemingly insignificant interactions. That will demand a significant cultural shift: critically necessary but, as this survey shows, difficult for many organizations’ leaders to wrap their heads around.

Brian Klepper, PhD, and David C. Kibbe, MD, MBA, write together on health care innovation, technology, and market dynamics.

14 replies

  1. Hi David,
    I’m delighted to see you have moved from the rather casual approach you espoused a few months ago regarding the security and privacy of medical records. (I’m referring to your anecdote about a North Carolina banker friend and people’s gradual acceptance of electronic banking transactions.)
    Privacy and security have been, are and will continue to be a major and ever growing concern of consumers. It’s about time they become a major concern of providers, vendors and government agencies.
    As you point out, breaches and thefts of electronic records of all sorts — including EMRs — are accelerating at an alarming rate and affecting millions of people in very harmful ways.
    Regrettably, I am not as optimistic as you that we can change people’s behavior towards record confidentiality and security. I believe there always will be people who want to access records of others for their own purposes — whether to simply read them, sell them and/or misuse them. Accordingly, we must try to prevent such breaches and thefts.
    My solution to this serious problem is simple – though some may think it draconian or Luddite-like! But whatever you label it, it will do more to protect the privacy and security of patient medical records — even while making them available when and where they are needed — than what we have on today’s drawing boards.
    First, continue to encourage care providers to adopt EMR systems but do NOT link them into large networks. Even standalone electronic silos create serious and complex security issues and require tight control procedures. Linking them together in ever-larger networks – from care provider practice, to RHIO, to HIE, to NHIN, to cloud or HealthVault or Google Health platforms – magnifies the security problems many times over.
    Second, aggregate a patient’s medical records in the hands and under the control of the patient. When the patient requires care, he or she simply gives the aggregated record to the treating provider and enters a password granting the provider access to specific records. This way care providers will have access to a patient’s complete record without the security risks that accompany electronic networking. (A rough sketch of this access model follows at the end of this comment.)
    (In the interest of full disclosure and as you know, the MedKaz™ System we are developing is just such a patient-focused system.)
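    A minimal sketch of the access model the commenter describes, using only the Python standard library: the patient holds the record, and a password-derived token per category lets a treating provider open some categories but not others. The class and method names here are hypothetical, not the actual MedKaz design, and a real system would also encrypt the stored entries at rest rather than merely gating reads.

    ```python
    # Hypothetical sketch only -- not the actual MedKaz design.
    import hashlib
    import hmac
    import os

    class PatientHeldRecord:
        """Patient-held record store. Each category gets its own access token
        derived from the patient's password, so a treating provider can be
        granted some categories without being able to open the others."""

        def __init__(self, password: str):
            self._salt = os.urandom(16)
            self._password = password.encode()
            self._records = {}  # category -> list of entries

        def _category_token(self, category: str) -> bytes:
            # Per-category token derived from the patient's password.
            key = hashlib.pbkdf2_hmac("sha256", self._password, self._salt, 100_000)
            return hmac.new(key, category.encode(), "sha256").digest()

        def add(self, category: str, entry: str) -> None:
            self._records.setdefault(category, []).append(entry)

        def grant(self, password: str, categories: list) -> dict:
            # The patient enters a password at the point of care and picks
            # which categories the provider may open.
            if not hmac.compare_digest(password.encode(), self._password):
                raise PermissionError("wrong password")
            return {c: self._category_token(c) for c in categories}

        def read(self, category: str, token: bytes) -> list:
            # The provider presents a granted token to read one category.
            if not hmac.compare_digest(token, self._category_token(category)):
                raise PermissionError("no access to this category")
            return self._records.get(category, [])

    record = PatientHeldRecord("correct horse battery staple")
    record.add("cardiology", "2009-11-02: echocardiogram normal")
    record.add("behavioral health", "2010-01-15: counseling note")

    grants = record.grant("correct horse battery staple", ["cardiology"])
    print(record.read("cardiology", grants["cardiology"]))      # allowed
    # record.read("behavioral health", grants["cardiology"])    # raises PermissionError
    ```

    The point of the per-category tokens is that the patient can hand a cardiologist the cardiology grant without exposing, say, behavioral health notes, which is the kind of selective access the comment argues large networks make hard to guarantee.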

  2. David and Brian –
    I’m sitting here at Health 2.0 in Paris and saw a tweet go by on #EHR that pointed me to this excellent post at The Health Care Blog. As I read through this article on personal health information, I can’t help but think back to your column early in the new year on the topic of electronic health records.
    As I commented then, this challenge is not limited to personal health information but extends to all of our information on the Internet. The problem is that each database includes both WHO I Am™ and WHAT I Am™, so when a breach occurs, everything is stolen. If WHO and WHAT were kept in different places, connected only under the user’s control, then this would no longer be “an accident waiting to happen.” (A toy illustration appears at the end of this comment.)
    The Express Scripts breach from a year ago is a very visible example of the consequences of this privacy and security problem (http://www.cloudinc.org/cloud-health/express-scripts-data-breach-isolated-incident-on-endemic-problem/#more-135). With over 700,000 records stolen from Express Scripts, a single breach can be a huge problem.
    As I mentioned in our post about Express Scripts, though, this is as much an architectural issue as it is a security problem for individual companies. If we had a standard on the Internet for people and information and not just web pages, then we could architect privacy and security into the Internet, as opposed to depending on the various investment decisions of individual companies.
    CLOUD, Inc. (Consortium for Local Ownership and Use of Data) is building this new architecture for the Internet. CTML (context markup language) goes far beyond identity, to empower Internet users to control precisely how their information is used. Think of it as privacy and authenticity standards that work — not a confusing Web-based control panel, but standards that let anyone — user or service provider — develop tools that are simultaneously more sophisticated and easier to use. Doing this requires a shift in thinking equivalent to what HTML brought to the Internet 15 years ago: this time, though, it is a markup language not for text but for people, one that supports Internet connections transcending the browser paradigm that has consumed us since the ’90s.
    CLOUD believes the keys to adoption are, first, Local Ownership and Use of Data — hence the name of our Consortium. Second, breaking down health, finance, education, and other silos to simply connect people, not industries: just as people use one main standard to connect text on the Internet, we should use one main standard to connect with each other. And third, empowering people to separate their identity from their data in every silo. Thus privacy is ensured, and the economic value of connections among people and their data grows. It’s ME 1.0, not just Web 2.0.
    Keep up the great work at Health Care Advisors, Health 2.0 and the Health Care Blog. We look forward to showing off a prototype of CLOUD and CTML at Health 2.0 in San Francisco in October!
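    A toy sketch of the WHO/WHAT separation described above, assuming nothing about CTML itself: identity and clinical facts sit in separate stores, joined only by a random pairing the user keeps. All names and data here are invented.

    ```python
    # Toy illustration of keeping WHO and WHAT apart; not CTML, all names made up.
    import secrets

    who_store = {}    # identity silo: who_token -> name, SSN, ...
    what_store = {}   # data silo:     what_token -> clinical facts, no identifiers

    def enroll(name, ssn, conditions):
        # Two unrelated random tokens; only the user keeps the pairing.
        who_token, what_token = secrets.token_hex(16), secrets.token_hex(16)
        who_store[who_token] = {"name": name, "ssn": ssn}
        what_store[what_token] = {"conditions": conditions}
        return {"who": who_token, "what": what_token}   # the user's link, held locally

    link = enroll("Jane Roe", "000-00-0000", ["hypertension"])

    # A breach of either silo alone yields only half the picture:
    print(what_store[link["what"]])   # facts about an unnamed person
    print(who_store[link["who"]])     # a name and SSN with no clinical data attached

    # Only the holder of the link can rejoin WHO and WHAT:
    print(who_store[link["who"]]["name"], what_store[link["what"]]["conditions"])
    ```

    The design point is that a breach of the data silo alone exposes facts about an unnamed person, while the identity silo alone exposes no health information; only the user-held link recombines them.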

  3. Is there anything in our lives that is adequately secure? Airplanes, cars, banking information, the stock market, our tax information? I don’t know; our whole connected-world enterprise seems pretty porous to me.

  4. Add to that the intentional leaks and intentional inaccuracies from our government and medical “professionals,” which are already at a high, especially for people with disabilities. But why would we care about the most hated group ever in the history of the world? It could only mean the difference between life and death, employed or not employed, for the 30% of the population that has been labeled “disabled.” Let’s just shove it under the rug, like we do with all the rest of the questions about this “health” care legislation. We make money when we screw up somebody else’s life – and enjoy protection from the government for doing it.

  5. The best place for your private information is locked in your doctor’s office. If anyone wants to know, they can call me and tell me why they need the information, and what they want, exactly, after you give me permission. Then I will decide what is appropriate to give out, recognizing that I am your advocate and will protect you.

  6. Short answer – NO. Opening Healthcare to the IRS … what could go wrong?

  7. Freedom from pain, illness and disease! Master Herbalist recommended, safe and effective. Achieve a healthier America through self help. Taking control of your health care is as easy as 1,2,3.

  8. Great article as usual, Brian and David.
    While security breaches need to be addressed, I recently looked at this subject from a slightly different perspective – Privacy.
    We have embarked on an effort to make data “fluid,” and it seems that all those small practices out there are going to be required to store, or at least upload, their data to various aggregators, whether these are EHR vendors, HIE vendors, RHIOs, platform owners or, ultimately, government.
    The question that begs to be asked is: what happens to all that data? What can those aggregators do with it? What will they do with it? What control, if any, do we have?
    Identifiable or deidentified patient data is worth a lot of money, and deidentification offers no real protection against reidentification. (A small worked example appears at the end of this comment.)
    Some companies are counting on monetizing these mountains of private information, for others the temptation will be enormous. This is not the sort of genie that can be pushed back into the bottle….
    And it’s not just financial data:
    “Electronic Medical Records can contain information on disease, medications, treatments, social habits, drinking habits, smoking status, sexual activity and orientation, abuse, depression, mental health, financial class, ethnicity, education, family circumstances, diet and exercise, residence, SSN, employment, travel, hobbies and whatever else providers choose to ask and we choose to answer.”
    The entire article is at http://onhealthtech.blogspot.com/2010/04/ehr-whose-record-is-it-anyway.html
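    To make the reidentification point concrete, here is a small, invented example of a linkage attack: the “deidentified” claims keep quasi-identifiers (ZIP code, birth date, sex) that can be joined against a public roster such as a voter file. The data and field names are made up.

    ```python
    # Invented data: "deidentified" claims still carry quasi-identifiers.
    deidentified_claims = [
        {"zip": "32034", "dob": "1961-07-28", "sex": "F", "diagnosis": "depression"},
        {"zip": "32034", "dob": "1975-03-02", "sex": "M", "diagnosis": "diabetes"},
    ]

    public_roster = [   # e.g. a voter file or social profile with the same fields
        {"name": "J. Smith", "zip": "32034", "dob": "1961-07-28", "sex": "F"},
        {"name": "R. Jones", "zip": "32034", "dob": "1975-03-02", "sex": "M"},
    ]

    quasi = ("zip", "dob", "sex")
    roster_index = {tuple(p[k] for k in quasi): p["name"] for p in public_roster}

    for claim in deidentified_claims:
        name = roster_index.get(tuple(claim[k] for k in quasi))
        if name:
            print(f"{name} -> {claim['diagnosis']}")   # record reidentified
    ```

    Even a few quasi-identifiers are often enough to make records unique, which is why stripping names and SSNs alone offers limited protection.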

  9. Excellent post. I especially appreciated your insight that security is a process, not a product. I take your point to be, in part, that we need more than just a technological solution to the security gaps in electronic health records. Instead, we need people to change fundamental behaviors to take into account life in an increasingly digital world.
    It strikes me that your insight is especially good because a technological patch not only fails to address the problem of electronic health record security, it may in fact exacerbate it by allowing bad behaviors to persist under the illusion of increased security, or in some cases even by deepening the bad habits that lead to such security gaps.
    I might further speculate that our failure to grasp this is a product of being out of touch with how the technologies function in the first place. We live in a world where we reap all the benefits of various technological gadgets, but relatively few of us have any insight into how they are put together and function. Accordingly, our imaginations are not even primed to imagine solutions within our own power, because there is a perceived divide between the users of a technology and its creators.

  10. Thanks to Brian and David for highlighting this issue. The post and the fine Kroll study focus on an important area – security measures related to confidentiality in large healthcare enterprises (e.g. hospitals, AMCs, VA).
    But one should not leave this topic without also viewing the issue from the perspective of the organizations where a great percentage of all PHI resides: small medical practices. Most practices are small, so this is a large segment of the health care institution population. The opportunities, challenges, risks, benefits, politics, and public relations related to security of PHI in small medical practices are distinct enough, in my opinion, that they deserve their own treatment.
    Generally, each small practice has less data today than a hospital, but still has data on thousands of patients. Small practices don’t have the level of administrative support that larger institutions have to focus on this area. While much of the PHI in small practices is on paper records now, we can reasonably expect much more of it to become e-PHI in the next few years. When this happens, we will have a generation of these small practices newly involved in managing this set of e-PHI risks that, as a group, they don’t have much experience managing. Confidentiality risks are an important subset of these risks, but risks to data integrity and availability are also present and need to be managed in order for small practices to succeed in using e-PHI.
    Doing well with security in small practices will be a great challenge. To the extent that we are not successful in meeting this challenge, the viability of many practices will be threatened and the ability to gain value from e-PHI will be threatened.

  11. At least in America you’re having the debate and there’s a reasonable chance the various parties will adhere to the course of actions decided upon…
    Unfortunately, here in the UK the issue was discussed and recommendations were made, and then various bodies ignored all of them – http://www.telegraph.co.uk/health/healthnews/7552827/Security-fears-as-NHS-sends-patient-records-to-India.html
    “The possible risks of transferring patient data abroad were exposed last year when undercover reporters from ITV’s Tonight programme were able to buy health records which were processed in India from a private hospital in London.”
    I hope people in America can learn from our mistakes.

  12. Perhaps this isn’t an issue that men think about very often, but with over 1 million abortions a year, many women are less concerned about the “theft” of their data than about the forced disclosure to all of their health care providers that they may have had this procedure.
    If, for example, you are one of the 8 million Kaiser members and you have even a therapeutic abortion – i.e., the baby died in utero – everyone from your eye doctor to your foot doctor will see it listed under procedures.
    In the same way that not all pharmacists are pro-choice, not all health care workers are, and the threat of this forced self-disclosure will have a huge impact on women once the various systems start to be linked.
    What, for example, will happen if a Catholic who goes to a private provider has this procedure done and then subsequently delivers another child at a Catholic hospital? Should she be concerned that it will be disclosed to the staff there as part of her permanent health record? Do the health benefits outweigh the risks? What will happen once nurses and other staff who work in these clinics are also “outed”?

  13. “Perhaps the HIMSS Analytics study’s most important and penetrating finding is that ‘health care organizations continue to think of data security in specific silos (IT, employees, etc.) and not as an organization-wide responsibility, which creates unwanted gaps in policies and procedures.’”
    This is what is known as the pot calling the kettle black. HIMSS is trying to protect itself from charges of conflict of interest and of promoting unsafe HIT devices.
    The breaches in privacy are widespread. Hospitals go to the extreme of using HIPAA to fire aggressive, patient-safety-oriented doctors (disruptive only to the unsafe conditions) who complain about environmental dangers, such as monitor alarms that are ignored, after they are “caught” looking at their own medical records (sic).