By JOEL KUPERSMITH, MD
As breaches of personal information affecting innumerable individuals over the years have shown, our approach to IT security falls short. Recent intrusions at Sony Pictures Entertainment and Anthem (80 million individuals), set against a backdrop of earlier substantial losses of personal health information (PHI) and other data, have again brought this deficiency to public attention. According to one estimate, almost 1 billion records were stolen via 1,500 breaches in 2014, a 78% increase over the previous year and a clear sign of a growing problem. Among personal information, health records are particular targets, fetching $20 per record versus $1-$2 for a credit card number, and surveys consistently show considerable public concern about the privacy of PHI.
In a recent commentary, David Brailer proposed that raising security standards for health information be one of four principles underlying new privacy legislation. I strongly agree, and I would add a specific step to apply this principle: privacy accreditation for health data custodians.
Whether the information is stored for care, insurance or research, the public has little grasp of the complexity of its stored PHI or of the large number of individuals who have access to it or custodial responsibility for it. There is thus a wide gap, and a power differential, between the people who supply their data and the organizations that hold enormous amounts of sensitive health information. This circumstance creates a need for an empowered intermediary to act on the public’s behalf, i.e., an accreditation body.
I would advocate a new health IT privacy accreditation body. It should be a non-profit entity, jump-started by legislation and funded through fees, buttressed by a congressional appropriation with a three-year sunset. It would evaluate data security measures comprehensively, in particular technical and personnel matters such as data-sharing procedures and encryption or equivalent safeguards. It would then confer accreditation and, in doing so, formally interpret, maintain, apply, enforce and in certain cases set privacy standards. Its processes would resemble those of analogous entities such as The Joint Commission, and it would need to adapt to the many and constantly changing technical and procedural details involved in securing data on shifting terrain.
Accreditation would apply to hospitals, insurance companies, health plans, research centers and other organizations holding at least a certain number of health records (a threshold to be determined). The accreditation body would conduct periodic announced and unannounced site visits and audits with graded outcomes, and there would be an appeals process. To give the body teeth, and in keeping with other accrediting entities, its accreditation should be a condition of federal funding (Medicare, NIH). Conflicts of interest within the body would be addressed by policies and by a balance of competing interests, with a spectrum of relevant stakeholders (corporations, patients, healthcare professionals, researchers, privacy experts, etc.) represented on its Board of Directors.
At present, corporate responsibility primarily governs IT security. The Office for Civil Rights provides federal enforcement and penalties by responding to complaints, and state governments also play a role. However, none of these entities acts as an accrediting body. Making privacy a larger part of other accreditation reviews would not provide a sufficient concentration of expertise focused on the complexities of IT security, and certification in specific areas does not address the overall problem.
Perhaps the chief concern about a new accreditation process is that it would saddle healthcare entities with yet another bureaucratic step and still more site visits, audits and reviews. It would likely cause dismay and considerable (and appropriate) discussion. The healthcare system is burdened enough, but an additional, detailed process seems necessary to meaningfully upgrade IT security.
Also, no audit can guarantee perfect and complete security; a favorable audit could be followed by a breach. But the process, with its mechanisms for self-improvement, would make such breaches far less likely. While technology can change very quickly (including between audits), accreditation reviews would determine whether the data custodian has the personnel and technical capacity to keep abreast of, and deal with, rapid change. Warning signs preceded the large loss at Target, and a smaller breach of personal information preceded the later Anthem loss; accreditation reviews would have noted both occurrences.
In conclusion, the privacy of health information has been considered a personal right since Hippocrates. Despite surveys showing strong concern about health privacy among the general population, our culture may or may not still be serious about maintaining it. If it is, preserving privacy will not come easily. Privacy accreditation of healthcare data custodians seems an achievable way to address this monumental and labyrinthine problem.
Joel Kupersmith led research efforts at the VA as Chief Research and Development Officer. He is an Adjunct Professor of Medicine at Georgetown.
While technological advances may underlie many IT intrusions in industry and defense, 80% of breaches in health IT (affecting 29.1 million individuals in 2010-2013) are due to what has been called poor “data hygiene”. Lapses include failure to implement basic precautions such as sound practices for authenticating users, encrypting healthcare data and prohibiting the storage of health information on employees’ personal electronic devices. An accreditation process is an excellent way to identify lapses of this nature.
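To make the encryption lapse concrete, here is a minimal sketch of encrypting a record at rest; the record contents and file names are hypothetical, and it assumes the widely used Python cryptography package rather than any particular vendor’s tooling.

```python
# Minimal sketch of one basic "data hygiene" precaution: encrypting PHI at rest.
# Assumes the third-party "cryptography" package; record and file names are hypothetical.
from cryptography.fernet import Fernet

# In practice the key would live in a key-management service, never beside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b'{"patient_id": "12345", "diagnosis": "hypertension"}'
encrypted = cipher.encrypt(record)

# Only ciphertext is written to disk, so a lost laptop or backup tape
# exposes no readable PHI without the key.
with open("record.enc", "wb") as f:
    f.write(encrypted)

# Authorized access decrypts with the same key.
assert cipher.decrypt(encrypted) == record
```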
Also, Texas has undertaken privacy and security certification through a voluntary program (SECURETexas) with a similar intent but a different scope from what I suggest here. The program applies to covered entities holding PHI and assesses compliance with state and federal laws. Created by a 2011 Texas law, it is managed by the state in collaboration with industry.
Well, I’m glad you are prioritizing this problem ‘way up high’. Your idea sounds bureaucratic and costly, but it might work.
But what if we just did something simpler, like using the Secure Shell protocol, SSH, with cryptography, added some image-handling ability, AND then added a criminal offense for “willfully and knowingly or passively allowing PHI to be sold, to be stolen, or to be used for any purposes other than those allowed and permitted by the owner-patient.”
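As a rough sketch of what I mean (the host, account and file names are made up, and it leans on the common paramiko SSH library), a PHI export could be pushed over SSH’s encrypted channel:

```python
# Rough sketch of the idea: pushing an already-encrypted PHI export over SSH (SFTP).
# Host, account, key path and file names are hypothetical; assumes the paramiko library.
import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()  # trust only hosts already known to this machine
client.connect("records.example-hospital.org",
               username="phi_transfer",
               key_filename="/etc/phi/transfer_key")  # key-based auth, no passwords

# Everything below travels inside SSH's encrypted channel.
sftp = client.open_sftp()
sftp.put("exports/records_2015-02.enc", "/incoming/records_2015-02.enc")
sftp.close()
client.close()
```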
Something like this?
I’m thinking that if you try to regulate and certify the process, the hacking technology will surely get ahead of you, and you will always be reacting, never proactive. If you penalize the outcome, i.e., the leakage of PHI, then the users will always be exerting maximum ingenuity… and will hopefully get ahead of the hackers.
But if this problem is not solved, forget EHRs, at least the ones that sit on LANs.
Where, or in what sector of the economy, has your idea been used? And was it successful?