2015 was the year health care got serious about cyber security.
Hackers gave the industry no other choice.
The year started with a massive data breach at Indianapolis-based Anthem Inc., which the health insurer revealed on Feb. 4. Hackers roamed around in Anthem’s computers for six weeks and stole personal and financial information of 78.8 million customers, as well as the information of 8.8 million customers at Blue Cross and Blue Shield plans not owned by Anthem.
There have been 269 data breaches at health care organizations this year, according to statistics collected through Dec. 22 by the Identity Theft Resource Center. That’s actually down from 2014, when health care organizations suffered 333 breaches.
But the number of records stolen has soared to 121.6 million, up from fewer than 8.4 million in 2014. Even without the Anthem breach, 34 million records were stolen this year from health care organizations.
The health care industry accounted for one out of every three breaches recorded by the Identity Theft Resource Center.
“They can and are trying to break into everything,” Doug Leonard, president of the Indiana Hospital Association, said of hackers. He added, “It’s really on everybody’s radar screen in the health care industry.”
In a survey released in August by consulting firm KPMG, 81 percent of health care executives said their organization had suffered a cyber attack in the previous two years and 13 percent said they were being attacked daily.
Recently, I took a bunch of heat for writing that Anthem was right not to encrypt. My point was that application encryption is just one of several security measures that add up to a security posture, and that we needed to wait for more information before condemning Anthem for a poor security posture.
A security posture is the combination of an organization’s overall security philosophy and the specific security steps the organization takes as a result of that philosophy. Basically, the posture an organization adopts shows whether it takes security and privacy seriously or prefers a “window dressing” approach. I argued that simply knowing that the database in question was not encrypted was not enough detail to assess Anthem’s security posture.
Well, we have more evidence now, and it’s not looking good for Anthem.
His argument that encryption wasn’t to blame for the largest healthcare data breach in U.S. history is technically correct, but lost in that technical argument is the fact that healthcare organizations are notably lax in their overall security profile. I found this out firsthand last year when I logged onto the network of a 300+ bed hospital about 2,000 miles away from my home office in Phoenix. I used a Chrome browser and a single malicious IP address that was provided by Norse. I wrote about the details of that here ‒ Just How Secure Are IT Networks In Healthcare? Spoiler alert: the answer to that question is not very.
I encourage everyone to read Fred’s article, of course, but the gist of his argument is that, technically, data encryption isn’t a simple choice, and it has the potential to cause data processing delays. That can be a critical trade-off when access to patient records is urgently needed. It’s also a valid point to argue that the Anthem breach should not be blamed on unencrypted data, but the headline itself is misleading, at best.
The Internet is abuzz criticizing Anthem for not encrypting its patient records. Anthem has been hacked, for those not paying attention.
Anthem was right, and the Internet is wrong. Or at least, Anthem should be “presumed innocent” on the issue. More importantly, by creating buzz around this issue, reporters are missing the real story: that multinational hacking forces are targeting large healthcare institutions.
Most lay people, clinicians, and, apparently, reporters simply do not understand when encryption is helpful. They presume that encrypted records are always more secure than unencrypted records, which is simplistic and untrue.
Encryption is a mechanism that ensures that data is useless without a key, much in the same way that your car is made useless without a car key. Given this analogy, what has apparently happened to Anthem is the security equivalent to a car-jacking.
When someone uses a gun to threaten a person into handing over both the car and the car keys that make that car usable, no one says, “Well, that car manufacturer needs to invest in more secure keys.”
In general, systems that rely on keys to protect assets are useless once the bad guy gets ahold of the keys. Apparently, whoever hacked Anthem was able to crack the system open enough to gain “programmer access”. Without knowing precisely what that means, it is fair to assume that even in a given system implementing “encryption-at-rest”, the programmers have the keys. Typically it is the programmer that hands out the keys.
Most of the time, hackers seek to “go around” encryption. Suggesting that we use more encryption, or that we should use it differently, is only useful when “going around it” is not simple. In this case, that is exactly what happened: the attackers went around the encryption rather than through it.
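The key argument above can be made concrete with a short sketch. This is a toy illustration only, not real cryptography and not Anthem’s actual system: the function names are hypothetical, and the XOR stream cipher stands in for whatever encryption-at-rest a database might use. The point it demonstrates is the one the analogy makes: the ciphertext is gibberish without the key, but anyone holding the key ‒ say, an attacker with stolen “programmer access” ‒ reads the record trivially.

```python
import hashlib
import hmac
import os

# Toy stream cipher for illustration only -- NOT real cryptography.
# It XORs data against a keystream derived from the key, just to show
# the property described above: ciphertext is useless without the key,
# and trivially readable with it.
def _keystream(key: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, _keystream(key, len(plaintext))))

toy_decrypt = toy_encrypt  # XOR is its own inverse

record = b"member_id=12345;dob=1970-01-01"   # hypothetical patient record
key = os.urandom(32)                          # the "car key" the application holds
ciphertext = toy_encrypt(key, record)

# Without the key, the stored bytes reveal nothing readable.
assert ciphertext != record

# With the key -- which "programmer access" typically includes --
# decryption is a one-liner.
assert toy_decrypt(key, ciphertext) == record
```

In other words, encrypting the database at rest only moves the problem to protecting the key; an attacker who compromises the layer that legitimately holds that key has gone around the encryption entirely.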