The Internet is abuzz criticizing Anthem for not encrypting its patient records. Anthem has been hacked, for those not paying attention.
Anthem was right, and the Internet is wrong. Or at least, Anthem should be “presumed innocent” on the issue. More importantly, by creating buzz around this issue, reporters are missing the real story: that multinational hacking forces are targeting large healthcare institutions.
Most lay people, clinicians, and, apparently, reporters simply do not understand when encryption is helpful. They presume that encrypted records are always more secure than unencrypted records, which is simplistic and untrue.
Encryption is a mechanism that ensures that data is useless without a key, much in the same way that your car is made useless without a car key. Given this analogy, what has apparently happened to Anthem is the security equivalent to a car-jacking.
When someone uses a gun to threaten a person into handing over both the car and the car keys, no one says, “well, that car manufacturer needs to invest in more secure keys.”
In general, systems that rely on keys to protect assets are useless once the bad guy gets hold of the keys. Apparently, whoever hacked Anthem was able to crack the system open enough to gain “programmer access”. Without knowing precisely what that means, it is fair to assume that even in a system implementing “encryption at rest”, the programmers have the keys. Typically, it is the programmers who hand out the keys.
Most of the time, hackers seek to “go around” encryption rather than break it. Suggesting that we use more encryption, or use it differently, is only useful when going around it is not simple. In this case, going around it is exactly what appears to have happened.
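To make that point concrete, here is a minimal sketch in Python using a deliberately toy XOR cipher (not real cryptography). It illustrates the argument above: encryption-at-rest protects a stolen disk, but any attacker who obtains the application’s own access, such as the reported “programmer access”, holds the key and sees plaintext. All names and values here are illustrative assumptions.

```python
import secrets

# Toy illustration (NOT real cryptography): encryption-at-rest protects
# the raw stored bytes, but the application must hold the key in order
# to serve records, so "programmer access" implies access to plaintext.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Symmetric toy cipher: the same function encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

KEY = secrets.token_bytes(16)       # held by the application/programmers
record = b"SSN=123-45-6789"
stored = xor_cipher(record, KEY)    # what sits "encrypted at rest" on disk

# A thief who steals only the disk sees ciphertext:
assert stored != record

# But an attacker with the application's credentials has the key too:
assert xor_cipher(stored, KEY) == record
```

The car-jacking analogy maps directly: the disk thief is defeated by the lock, while the attacker who takes the keys drives away regardless of how good the lock is.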
Continue reading “Anthem Was Right Not to Encrypt”
Filed Under: Tech, THCB
Tagged: Anthem, Encryption, Fred Trotter, Hacking, HIPAA, Privacy, State sponsored crime
Feb 9, 2015
In the future, everything will be connected.
That future is almost here.
Over a year ago, the Federal Trade Commission held an Internet of Things workshop, and it has finally issued a report summarizing the comments and recommendations that came out of that conclave.
As with the HITECH Act’s attempt to increase public confidence in electronic health records by ramping up privacy and security protections for health data, the IoT report — and an accompanying publication recommending that industry take a risk-based approach to development and adhere to best practices (encryption, authentication, etc.) — seeks to increase the public’s confidence. But it does so the FTC way: no actual rules, just guidance the FTC can use later in enforcement cases. The FTC can take action against an entity that engages in unfair or deceptive business practices, but such practices are defined by case law (administrative and judicial), not regulations. That creates the U.S. Supreme Court and pornography conundrum: I can’t define it, but I know it when I see it (see Justice Stewart’s timeless concurring opinion in Jacobellis v. Ohio).
To anyone actively involved in data privacy and security, the recommendations seem frighteningly basic:
- build security into devices at the outset, rather than as an afterthought in the design process;
- train employees about the importance of security, and ensure that security is managed at an appropriate level in the organization;
- ensure that when outside service providers are hired, those providers are capable of maintaining reasonable security, and provide reasonable oversight of the providers;
- when a security risk is identified, consider a “defense-in-depth” strategy whereby multiple layers of security may be used to defend against a particular risk;
- consider measures to keep unauthorized users from accessing a consumer’s device, data, or personal information stored on the network;
- monitor connected devices throughout their expected life cycle, and where feasible, provide security patches to cover known risks;
- consider data minimization — that is, limiting the collection of consumer data, and retaining that information only for a set period of time, not indefinitely;
- notify consumers and give them choices about how their information will be used, particularly when the data collection is beyond consumers’ reasonable expectations.
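As one illustration of the data-minimization recommendation, a device maker might keep only the fields it needs and purge records past a set retention window. This is a minimal sketch under stated assumptions: the field names and the 90-day window are illustrative, not anything the FTC prescribes.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Illustrative retention window; the FTC report recommends a "set period
# of time," not a specific number of days.
RETENTION = timedelta(days=90)

@dataclass
class Reading:
    device_id: str       # only the fields actually needed are collected
    value: float
    recorded_at: datetime

def purge_expired(readings, now=None):
    """Drop readings older than the retention window."""
    now = now or datetime.now(timezone.utc)
    return [r for r in readings if now - r.recorded_at <= RETENTION]

now = datetime.now(timezone.utc)
readings = [
    Reading("thermostat-1", 21.5, now - timedelta(days=10)),
    Reading("thermostat-1", 20.0, now - timedelta(days=200)),  # past retention
]
kept = purge_expired(readings, now)
assert len(kept) == 1 and kept[0].value == 21.5
```

In practice the purge would run as a scheduled job against the datastore; the point is simply that retention is a design decision made up front, not an afterthought.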
Continue reading “Privacy and Security and the Internet of Things”
Filed Under: Tech, THCB
Tagged: Data minimization, HIPAA, HITECH, Internet of Things, Privacy, Security
Feb 3, 2015
Over the last five years, the United States has undergone perhaps its most significant health care changes since Medicare and Medicaid were introduced in the 1960s. The Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009 and the Patient Protection and Affordable Care Act of 2010 have paved the way for tremendous changes to our system’s information backbone and aim to give more Americans access to health care.
But one often-overlooked segment of our health care system has been letting us down. Patients’ access to their own medical information remains limited. The HIPAA Privacy Rule grants individuals the right to copies of their own medical records, but it comes at a noteworthy cost—health care providers are allowed to charge patients a fee for each record request. As explained on the Department of Health and Human Services’ website, “the Privacy Rule permits the covered entity to impose reasonable, cost-based fees.”
HIPAA is a federal law, so the states have each imposed guidelines outlining their own interpretations of “reasonable.” Ideally, the price of a record request would remain relatively constant — after all, the cost of producing these records does not differ significantly from state to state. But in reality, requesting one’s medical record is not only unreasonably expensive; the cost is also inconsistent, varying dramatically based on local regulation. Continue reading “An Open Letter to the People Who Brought Us HIPAA”
Filed Under: Tech, THCB
Tagged: HIPAA, HIPAA Privacy Rule, Medical record, Privacy
Jan 13, 2015
Of the nearly 100 people I interviewed for my upcoming book, John Halamka was one of the most fascinating. Halamka is CIO of Beth Israel Deaconess Medical Center and a national leader in health IT policy. He also runs a family farm, on which he raises ducks, alpacas and llamas. His penchant for black mock turtlenecks, along with his brilliance and quirkiness, raise inevitable comparisons to Steve Jobs. I interviewed him in Boston on August 12, 2014.
Our conversation was very wide ranging, but I was particularly struck by what Halamka had to say about federal privacy regulations and HIPAA, and their impact on his job as CIO. Let’s start with that.
Halamka: Not long ago, one of our physicians went into an Apple store and bought a laptop. He returned to his office, plugged it in, and synched his e-mail. He then left for a meeting. When he came back, the laptop was gone. We looked at the video footage and saw that a known felon had entered the building, grabbed the laptop, and fled. We found him, and he was arrested.
Now, what is the likelihood that this drug fiend stole the device because he had identity theft in mind? That would be zero. But the case has now exceeded $500,000 in legal fees, forensic work, and investigations. We are close to signing a settlement agreement where we basically say, “It wasn’t our fault but here’s a set of actions Beth Israel will put in place so that no doctor is ever allowed again to bring a device into our environment and download patient data to it.”
Continue reading “Black Turtlenecks, Data Fiends and Code. An Interview with John Halamka”
Filed Under: Tech, THCB
Tagged: Apple Store, Clinical Informatics, EMRs, HIPAA, John Halamka, Privacy
Jan 5, 2015
Long time (well, very long time) readers of THCB will remember my extreme frustration with Patient Privacy Rights founder Deborah Peel, who, as far as I can tell, spent the entire 2000s opposing electronic health data in general and commercial EMR vendors in particular. I even wrote a very critical piece about her and the people from the World Privacy Forum, whom I felt were fellow travelers, back in 2008. And perhaps nothing annoyed me more than her consistently claiming that data exchange was illegal and that vendors were selling personally identified health data for marketing and related purposes to non-covered entities (which is illegal under HIPAA).
However, in recent years Deborah has teamed up with Adrian Gropper, whom I respect, and seemed to change her tune from “all electronic data violates privacy and is therefore bad” to “we can do health data in a way that safeguards privacy but achieves the efficiencies of care improvement via electronic data exchange.” But she never really came clean on all those claims about vendors selling personally identified health data, and in a semi-related thread on THCB last week, it all came back, including some outrageous statements on the extent of, value of, and implications of selling personally identified health data. So I’ve decided to move all the relevant comments to this blog post and let the disagreement continue.
What started the conversation was a throwaway paragraph at the end of a comment I left, in which I basically told Adrian to rewrite what he was saying so that normal people could understand it. Here’s my last paragraph:
As it is, this is not a helpful open letter, and it makes a bunch of aggressive claims against mostly teeny vendors who have historically been on the patients’ side in terms of accessing data. So Adrian, Deborah & PPR need to do a lot better. Or else they risk being excluded back to the fringes like they were in the days when Deborah & her allies at the World Privacy Forum were making ridiculous statements about the concept of data exchange.
Here’s Deborah’s first comment Continue reading “Is Deborah Peel up to her old tricks?”
Filed Under: THCB
Tagged: Deborah Peel, HIPAA, Matthew Holt, patient data, Patient Privacy Rights, Privacy
Nov 23, 2014
This story was co-published with NPR’s “Shots” blog.
In the name of patient privacy, a security guard at a hospital in Springfield, Missouri, threatened a mother with jail for trying to take a photograph of her own son. In the name of patient privacy, a Daytona Beach, Florida, nursing home said it couldn’t cooperate with police investigating allegations of a possible rape against one of its residents.
In the name of patient privacy, the U.S. Department of Veterans Affairs allegedly threatened or retaliated against employees who were trying to blow the whistle on agency wrongdoing. When the federal Health Insurance Portability and Accountability Act passed in 1996, its laudable provisions included preventing patients’ medical information from being shared without their consent, along with other important privacy assurances. But as the litany of recent examples shows, HIPAA, as the law is commonly known, is open to misinterpretation – and sometimes provides cover for health institutions that are protecting their own interests, not patients’.
“Sometimes it’s really hard to tell whether people are just genuinely confused or misinformed, or whether they’re intentionally obfuscating,” said Deven McGraw, partner in the healthcare practice of Manatt, Phelps & Phillips and former director of the Health Privacy Project at the Center for Democracy & Technology. For example, McGraw said, a frequent health privacy complaint to the U.S. Department of Health and Human Services Office for Civil Rights is that health providers have denied patients access to their medical records, citing HIPAA. In fact, this is one of the law’s signature guarantees. “Often they’re told [by hospitals that] HIPAA doesn’t allow you to have your records, when the exact opposite is true,” McGraw said.
I’ve seen firsthand how HIPAA can be incorrectly invoked.
In 2005, when I was a reporter at the Los Angeles Times, I was asked to help cover a train derailment in Glendale, California, by trying to talk to injured patients at local hospitals. Some hospitals refused to help arrange any interviews, citing federal patient privacy laws. Other hospitals were far more accommodating, offering to contact patients and ask if they were willing to talk to a reporter. Some did. It seemed to me that the hospitals that cited HIPAA simply didn’t want to ask patients for permission.
Continue reading “Are Patient Privacy Laws Being Abused to Protect Medical Centers?”
Filed Under: OP-ED, THCB
Tagged: Charles Ornstein, Deven McGraw, HIPAA, Hospitals, LA Times, patient information, Privacy, VA
Jul 24, 2014
By now, most of you have probably heard — perhaps via your Facebook feed itself — that for one week in January of 2012, Facebook altered the algorithms it uses to determine which status updates appeared in the News Feed of 689,003 randomly selected users (about 1 of every 2,500 Facebook users). The results of this study — conducted by Adam Kramer of Facebook, Jamie Guillory of the University of California, San Francisco, and Jeffrey Hancock of Cornell — were just published in the Proceedings of the National Academy of Sciences (PNAS).
Although some have defended the study, most have criticized it as unethical, primarily because the closest that these 689,003 users came to giving voluntary, informed consent to participate was when they—and the rest of us—created a Facebook account and thereby agreed to Facebook’s Data Use Policy, which in its current iteration warns users that Facebook “may use the information we receive about you . . . for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”
Some of the discussion has reflected quite a bit of misunderstanding about the applicability of federal research regulations and IRB review to various kinds of actors, about when informed consent is and isn’t required under those regulations, and about what the study itself entailed.
Continue reading “Why We May Be Making Too Much of the Facebook Experiment”
Filed Under: THCB
Tagged: Cornell, Emoticons, Facebook, Human Subjects Research, Internal Review Board, Privacy, Proceedings of the National Academy of Sciences, UCSF
Jul 1, 2014
At the first White House public workshop on Big Data, Latanya Sweeney, a leading privacy researcher at Carnegie Mellon and Harvard who is now the chief technologist for the Federal Trade Commission, was quoted as asking about privacy and big data, “computer science got us into this mess; can computer science get us out of it?”
There is a lot computer science and other technology can do to help consumers in this area. Some examples:
• The same predictive analytics and machine learning used to understand and manage preferences for products or content and improve user experience can be applied to privacy preferences. This would take some of the burden off individuals to manage their privacy preferences actively and enable providers to adjust disclosures and consent for differing contexts that raise different privacy sensitivities.
Computer science has done a lot to improve user interfaces and user experience by making them context-sensitive, and the same can be done to improve users’ privacy experience.
• Tagging and tracking privacy metadata would strengthen accountability by making it easier to ensure that use, retention, and sharing of data is consistent with expectations when the data was first provided.
• Developing features and platforms that let consumers see what data is collected about them, that use visualizations to make the data easier to interpret, and that return data to consumers in ways that let them capture more of the benefit of the data they themselves generate would provide much more dynamic and meaningful transparency than static privacy policies that few consumers read and only experts can usefully interpret.
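The privacy-metadata idea above — tagging data so later use, retention, and sharing can be checked against the expectations set when the data was first provided — can be sketched in a few lines. The purpose vocabulary here is a hypothetical illustration, not a standard.

```python
from dataclasses import dataclass, field

# Sketch of "tagging and tracking privacy metadata": each datum carries
# the purposes the consumer consented to, and every downstream use is
# checked against that consent record. Purpose names are illustrative.

@dataclass
class TaggedDatum:
    value: str
    allowed_purposes: frozenset = field(default_factory=frozenset)

def use(datum: TaggedDatum, purpose: str) -> str:
    # Accountability check: refuse uses outside the original consent.
    if purpose not in datum.allowed_purposes:
        raise PermissionError(f"use for {purpose!r} exceeds consent")
    return datum.value

email = TaggedDatum("pat@example.com", frozenset({"service", "support"}))
assert use(email, "support") == "pat@example.com"

blocked = False
try:
    use(email, "marketing")   # outside the consented purposes
except PermissionError:
    blocked = True
assert blocked
```

A real system would attach such tags at collection time and enforce them in the data layer rather than in application code, but the accountability mechanism is the same.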
In a recent speech to MIT’s industrial partners, I presented examples of research on privacy-protecting technologies.
Continue reading “Using Technology to Better Inform Consumers about Privacy Decisions”
Filed Under: Uncategorized
Tagged: Apps, Big Data, Brookings Institution, Cameron Kerry, Consumer Protection, Design, Privacy
Apr 30, 2014
T was never a star service tech at the auto dealership where he worked for more than a decade. If you lined up all the techs, he wouldn’t stand out: medium height, late-middle age, pudgy, he was as middle-of-the-pack as a guy could get.
He was exactly the type of employee that his employer’s wellness vendor said was their ideal customer. They could fix him.
A genial sort, T thought nothing of sitting with a “health coach” to have his blood pressure and blood taken, get weighed, and then use the coach’s notebook computer to answer, for the first time in his life, a health risk appraisal.
He found many of the questions oddly personal: how much did he drink, how often did he have (unprotected) sex, did he use sleeping pills or pain relievers, was he depressed, did he have many friends, did he drive faster than the speed limit? But, not wanting to rock the boat, and anxious to get the $100/month bonus that came with being in the wellness program, he coughed up this personal information.
The feedback T got, in the form of a letter sent to both his home and his company mailbox, was that he should lose weight, lower his cholesterol and blood pressure, and keep an eye on his blood sugar. Then came the perfect storm that T never saw developing.
His dealership started cutting employees a month later. In the blink of an eye, a decade of service ended with a “thanks, it’s been nice to know you” letter and a few months of severance.
T found the timing of dismissal to be strangely coincidental with the incentivized disclosure of his health information.
Continue reading “What If Your Employer Gets Access to Your Medical Records?”
Filed Under: THCB, The Vault
Tagged: Al Lewis, data breaches, Employers, personal health records, Privacy, Vik Khanna, Wellness, workplace wellness programs
Mar 25, 2014
The field of analytics has fallen into a few big holes lately that represent both its promise and its peril. These holes pertain to privacy, policy, and predictions.
Policy. 2.2/7. The biggest analytics project in recent history is the $6 billion federal investment in the health exchanges. The goals of the health exchanges are to enroll people in the health insurance plans of their choice, determine insurance subsidies for individuals, and inform insurance companies so that they could issue policies and bills.
The project touches on all the requisites of analytics, including big data collection, multiple sources, integration, embedded algorithms, real-time reporting, and state-of-the-art software and hardware. As everyone knows, the implementation was a terrible failure.
The CBO’s conservative estimate was that 7 million individuals would enroll in the exchanges. Only 2.2 million did so by the end of 2013. (This does not include Medicaid enrollment, which had its own projections.) The big federal vendor, CGI, is being blamed for the mess.
Note that CGI was also the vendor for the Commonwealth of Massachusetts which had the worst performance of all states in meeting enrollment numbers despite its long head start as the Romney reform state and its groundbreaking exchange called the Connector. New analytics vendors, including Accenture and Optum, have been brought in for the rescue.
Was it really a result of bad software, hardware, and coding? Was it that the design to enroll and determine subsidies had “complexity built-in” because of the legislation that cobbled together existing cumbersome systems, e.g. private health insurance systems? Was it because of the incessant politics of repeal that distracted policy implementation? Yes, all of the above.
The big “hole”, in my view, was the lack of communications between the policy makers (the business) and the technology people. The technologists complained that the business could not make decisions and provide clear guidance. The business expected the technology companies to know all about the complicated analytics and get the job done, on time.
This rift, in which each group did not know how to talk with the other, is recognized as a critical failure point. In fact, those who are stepping into the rescue role have emphasized that there will be management status checks daily “at 9 AM and 5 PM” to bring people together, know the plan, manage the project, stay focused, and solve problems.
Walking around the hole will require a better understanding of why the business and the technology folks do not communicate well, and recognition that soft people skills can avert hard technical catastrophes.
Continue reading “Very Big Data”
Filed Under: THCB
Tagged: analytics, Big Data, CGI, Dwight McNeill, Healthcare.gov, Predictive analytics, Privacy, Target
Mar 19, 2014