Health reform activists and privacy mavens have been at loggerheads for years. Those touting health reform complain that oversensitivity to privacy risks would hold back progress in treatments. Pushing in the opposite direction, privacy advocates argue that current policies endanger patients and that the rush to electronic records and health information exchange could make things worse.
It’s time to get past these arguments and find common ground on which to institute policies that benefit patients. Luckily, that moment has arrived: the concern both camps share for giving patients power and control can drive technological and policy solutions.
Deborah Peel, a psychiatrist who founded Patient Privacy Rights, has been excoriated by data use advocates for ill-considered claims and statements in the past. But her engagement with technology experts has grown over the years, and with the appointment of Adrian Gropper, a leading blogger on this site, as Chief Technology Officer, PPR is making real contributions to the discussion of appropriate technologies.
PPR has also held three Health Privacy Summits in Washington, DC, at the Georgetown Law Center, just a few blocks from the Capitol building. Although Congressional aides haven’t found their way to these conferences as we hoped (I am on the conference’s planning committee), they do draw a wide range of state and federal administrators along with technologists, lawyers, academics, patient advocates, and health care industry analysts. The most recent summit, held on June 5 and 6, found some ways to move forward on the data sharing vs. privacy stand-off in such areas as patient repositories, consent, anonymization, and data segmentation. It also highlighted how difficult these tasks are.
What do patients really think?
There is no hiding from patient concerns anymore. Everyone in health care reform is for the patient: she needs to be a partner with the clinicians, needs to be empowered and to take charge, and–echoing the “Give me my damn data” slogan invented by e-Patient Dave–needs to have access to all the data collected on her by clinicians. So what do these patients we all love (and become ourselves) think of other entities getting their data?
Researchers and reformers tend to believe that the more sharing, the better. I actually heard a prominent patient advocate say at a conference in 2011, “No patient will say ‘please don’t use my data’ if it will help me or help someone else in my position,” although he backed away from the inflexibility of this statement when I questioned him about it. In contrast to this blithe trust, let’s review some findings from the health privacy summit:
- Kathryn Serkes, founder of the Doctor Patient Medical Association, cited the disturbing results of a survey on privacy and trust. Most doctors have been asked by a patient to lie in their record, and 2 out of 3 doctors admit having done so. (Probably even more do it and won’t admit to it.) Apparently, patients and their providers alike fear what can happen to data “downstream.” Parenthetically, this also means that many patients’ records are incomplete or incorrect, which may hurt them later. In addition, 67% of doctors think electronic records will harm patient privacy, while only 8% think they will improve it.
- Kelly Caine cited patients from one of her studies who said they would be willing to share data with researchers, but only if asked first. There may be a new way to avoid the overhead this would require: Portable Legal Consent, a way of letting patients offer their data to classes of future researchers. I covered Portable Legal Consent in a report from a previous conference, and it has come up approvingly at the health privacy summits.
- At the Health Data Forum (Health Datapalooza) that took place just before the privacy summit, a provider whose facility serves many poor and Medicaid patients told me their patients perennially give them wrong data, including wrong names and addresses, making it hard to reach patients later with information that could protect their health. Apparently, the patients associate the clinic with the government and want the government to have as little information on them as they can get away with. If this seems paranoid–well, let me just remind you of the many scandals about US government spying that have been popping up around the time of the conferences.
In a cynical dismissal of public understanding that has been widely quoted, security expert Bruce Schneier said in a 2001 interview: “If McDonalds in the United States would give away a free hamburger for a DNA sample they would be handing out free lunches around the clock.” This might have been true in 2001, before the public knew the implications of their DNA. Now they do. Health privacy remains an ongoing concern.
Can we control our own data?
Patients don’t know where all their data is leaking out to, but the ones who try to keep data out of their records are instinctively guarding against real threats. As a few examples:
- Right before the health privacy summit, one of its annual speakers, Latanya Sweeney, released information on the re-identification of patients in Washington state who were supposedly de-identified in public health data being sold by that state. Sweeney, who is famous for uncovering the health information of Massachusetts Governor William Weld, combined the state’s health data with public databases and news reports to re-identify over 60 people in Washington. About 40% of people whose hospitalizations were reported in news articles could be re-identified through such data combinations, and even more were re-identified with a little more searching on the Internet.
- A patient spoke at the summit about the awful shock of finding her name, address, diagnosis, and treatment information on a gaming site, where apparently it had been inadvertently uploaded (along with the data of 14 other patients) by a careless researcher at NIH. The information came from an NIH study in which she had participated, but NIH closed its investigation without telling her anything about how the breach happened, and even denied her FOIA request.
- At the 2012 summit, another patient reported how sensitive information on her mental health and her experience as an abused child, which she told a doctor within the enormous Boston-area Partners HealthCare network, was disseminated throughout the network, leading to doctors making unprofessional and unwarranted deductions during routine care. As consolidation (encouraged through such projects as Accountable Care Organizations) brings more and more patients into such large bureaucracies, this patient’s experience should concern us all.
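The kind of linkage attack Sweeney described can be sketched in a few lines of code. This is a purely illustrative example, not her actual method or data: it joins the quasi-identifiers (ZIP code, age, sex, admission date) in an invented “de-identified” hospital data set against invented news reports that name individuals.

```python
# Illustrative linkage-attack sketch. All records below are invented;
# this is not Sweeney's data set or exact technique.

deidentified_hospital_records = [
    {"zip": "98101", "age": 54, "sex": "M", "admitted": "2011-06-12",
     "diagnosis": "cardiac arrhythmia"},
    {"zip": "98052", "age": 31, "sex": "F", "admitted": "2011-08-03",
     "diagnosis": "fractured femur"},
]

# A hypothetical news story: "54-year-old man from ZIP 98101 hospitalized
# June 12..." supplies the name the hospital data set withheld.
news_reports = [
    {"name": "John Doe", "zip": "98101", "age": 54, "sex": "M",
     "admitted": "2011-06-12"},
]

QUASI_IDENTIFIERS = ("zip", "age", "sex", "admitted")

def reidentify(records, reports):
    """Return (name, diagnosis) pairs where the quasi-identifiers
    in a news report match exactly one 'de-identified' record."""
    matches = []
    for report in reports:
        hits = [r for r in records
                if all(r[q] == report[q] for q in QUASI_IDENTIFIERS)]
        if len(hits) == 1:  # a unique match pins the diagnosis to a name
            matches.append((report["name"], hits[0]["diagnosis"]))
    return matches

print(reidentify(deidentified_hospital_records, news_reports))
# → [('John Doe', 'cardiac arrhythmia')]
```

The point is that no single field identifies anyone; it is the *combination* of mundane fields, cross-referenced with public sources, that does the damage.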
De-identified data underlies all clinical and pharmaceutical research. The many US states that sell health care data, like Washington, don’t de-identify adequately. And some privacy advocates doubt that any de-identification can work over the long term. Certainly, all current data sets are in jeopardy, given trends in the collection of everyday purchase and personal data and continual improvements in statistical methods.
But there are (a relatively few) data experts who know the considerations and techniques for keeping risk within reasonable bounds while de-identifying health data. I will indulge here in a bit of promotion, mentioning a new O’Reilly book called Anonymizing Health Data by Khaled El Emam and Luk Arbuckle.
Although Sweeney has repeatedly criticized organizations that de-identify data and has proven their efforts to be inadequate, she expressed a hopeful attitude toward de-identification at the health privacy summit. She pointed out that encryption, which everyone depends on, has improved over the decades as new techniques get introduced and broken, leading to yet more techniques. De-identification, she said, will go through the same iterative evolution.
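One concrete measure of re-identification risk, introduced by Sweeney herself, is k-anonymity: a data set is k-anonymous when every combination of quasi-identifier values is shared by at least k records. The check below is a minimal sketch with invented, already-generalized records (ZIP codes truncated, ages banded); real de-identification work involves much more, as the El Emam and Arbuckle book describes.

```python
# Minimal k-anonymity check over invented, pre-generalized records.
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """True if every quasi-identifier combination occurs at least k times."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

records = [
    {"zip": "981**", "age_band": "50-59", "sex": "M"},
    {"zip": "981**", "age_band": "50-59", "sex": "M"},
    {"zip": "980**", "age_band": "30-39", "sex": "F"},
]

print(is_k_anonymous(records, ("zip", "age_band", "sex"), 2))
# → False: the third record is unique, so it would need further
# generalization or suppression to reach k=2
```

The iterative evolution Sweeney predicts is visible even here: k-anonymity itself was later shown to be insufficient against certain attacks, prompting refinements, just as broken encryption schemes prompt stronger ones.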
Both identified and de-identified patient data are out of control. Sweeney’s web site, the Data Map, keeps track of the various places data goes. And it does not even account for breaches or other illegal access.
Resolving the dispute
How can we lead health care research into the modern age of big data while respecting patients’ worries over misuse of their data? Looking at the difficulties, as the health care summits have done, is the start. But the summits have also been tacking toward solutions.
A forum on data segmentation presented services and pilot programs that can allow patients to release certain parts of their data sets (for instance, information about cancer diagnoses and treatments) while restricting others (such as illegal drug use or mental health problems). Lively debates took place about the dangers patients could put themselves in by withholding information from providers. But as we’ve seen, this happens now already.
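At its core, data segmentation means tagging each entry in a record with a sensitivity category and filtering releases against the patient’s consent settings. The sketch below is hypothetical: the category names, record structure, and consent format are my own inventions for illustration, not any of the piloted services.

```python
# Hypothetical data-segmentation sketch. Categories, records, and the
# consent format are invented for illustration only.

patient_record = [
    {"category": "oncology", "entry": "stage II melanoma, excised 2012"},
    {"category": "mental_health", "entry": "treated for depression"},
    {"category": "substance_use", "entry": "past recreational drug use"},
]

# The patient consents to share cancer data but withholds the rest.
consent = {"oncology": True, "mental_health": False, "substance_use": False}

def release(record, consent):
    """Return only entries whose category the patient consented to share;
    unlisted categories default to withheld."""
    return [item for item in record if consent.get(item["category"], False)]

print(release(patient_record, consent))
# → [{'category': 'oncology', 'entry': 'stage II melanoma, excised 2012'}]
```

The hard problems the forum debated live outside this toy filter: deciding which category an entry belongs to, and what a provider should be told about withheld material.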
Adrian Gropper looked at cloud computing from the viewpoint of principles that PPR has laid out for protecting patient data, and found that cloud storage is quite promising as an aid to privacy. For instance, cloud providers guard against breaches much better than most health care institutions, and centralized storage allows better auditing.
Discussions took place about updating HIPAA, as well as enforcing the provisions it has contained from the start for allowing patients to get access to their data. Leon Rodriguez, Director of the Office for Civil Rights in the Department of Health and Human Services, said his office aggressively disciplines health providers when patients report problems getting their data.
A far-ranging conversation
The third summit was, in my opinion, the best yet. I lack the space to share all the eye-opening conversations I heard, and heartily recommend the summit to anyone who cares about health care or personal autonomy.
The conference heard from Peter Hustinx, the European Data Protection Supervisor, who pointed out that the idea of notifying victims when breaches have released their information is a US invention, but is more widely respected in Europe. We also heard from US CTO Todd Park, who reiterated his frequent endorsement of patient control over their data. The annual privacy awards were given to Hustinx and to Mark Rothstein, Professor at the University of Louisville School of Medicine. And PPR publicized a petition to protect data submitted to health information exchanges.
Andy Oram is an editor at O’Reilly Media, a highly respected book publisher and technology information provider. His work for O’Reilly includes the influential 2001 title Peer-to-Peer, the 2005 ground-breaking book Running Linux, and the 2007 best-seller Beautiful Code.