Anthem Was Right Not to Encrypt


The Internet is abuzz with criticism of Anthem for not encrypting its patient records. Anthem has been hacked, for those not paying attention.

Anthem was right, and the Internet is wrong. Or at least, Anthem should be “presumed innocent” on the issue. More importantly, by creating buzz around this issue, reporters are missing the real story: that multinational hacking forces are targeting large healthcare institutions.

Most lay people, clinicians, and, apparently, reporters simply do not understand when encryption is helpful. They presume that encrypted records are always more secure than unencrypted records, which is simplistic and untrue.

Encryption is a mechanism that ensures that data is useless without a key, much in the same way that your car is made useless without a car key. Given this analogy, what has apparently happened to Anthem is the security equivalent to a car-jacking.

When someone uses a gun to force a person to hand over both the car and the car keys, no one says, “Well, that car manufacturer needs to invest in more secure keys.”

In general, systems that rely on keys to protect assets are useless once the bad guy gets hold of the keys. Apparently, whoever hacked Anthem was able to crack the system open enough to gain “programmer access”. Without knowing precisely what that means, it is fair to assume that even in a system implementing “encryption-at-rest”, the programmers have the keys. Typically it is the programmers who hand out the keys.
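The point about keys can be made concrete in a few lines of Python. This is a toy sketch only: a hash-based XOR stream stands in for real encryption, and the record and key are invented. The lesson it illustrates is the one above: once the attacker holds the key, the ciphertext protects nothing.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream by hashing key + counter (toy construction,
    # NOT a real cipher -- real systems would use AES).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR is symmetric: the same call encrypts and decrypts.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

# The application encrypts records "at rest" with a key it holds.
key = secrets.token_bytes(32)
record = b"SSN=123-45-6789;DOB=1970-01-01"
stored = xor_cipher(key, record)

# An attacker who only steals the disk sees ciphertext...
assert stored != record

# ...but an attacker with programmer-level access also has the key,
# and encryption-at-rest buys nothing:
assert xor_cipher(key, stored) == record
```

The same structure holds for any encryption-at-rest scheme: the security reduces entirely to who can reach the key.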

Most of the time, hackers seek to “go around” encryption. Suggesting that we use more encryption, or that we use it differently, is only useful when “going around it” is not simple. In this case, going around it is exactly what happened.

The average lay person, like the average clinician, does not bother to think carefully about security in general. Making an investment in the wrong set of defenses decreases, rather than increases, the overall security of a system. This argument is at the heart of the case against the TSA, which makes us “feel” more secure without actually increasing our security. The phrase for this is “security theater”.

You see, encryption at rest, unlike encryption in transit, comes with significant risks. The first is that keys might be lost. Unlike car keys, once encryption keys are lost there is no way to “make new ones”. Of course you could back up your keys, securely, off-site, but that means extra cost and extra steps. Second, if encrypted data becomes corrupted, it is much more difficult to recover than unencrypted data.
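The corruption risk is easy to demonstrate. The sketch below is a toy (a hash-based XOR stream with an HMAC integrity tag standing in for a real authenticated cipher): one flipped bit in plaintext storage would garble a single character, but one flipped bit in an authenticated encrypted blob makes the entire record unrecoverable.

```python
import hashlib
import hmac
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream: hash of key + counter (illustrative only).
    out, i = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        i += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    # Encrypt-then-MAC: ciphertext followed by a 32-byte integrity tag.
    ct = bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))
    tag = hmac.new(key, ct, hashlib.sha256).digest()
    return ct + tag

def decrypt(key: bytes, blob: bytes) -> bytes:
    ct, tag = blob[:-32], blob[-32:]
    if not hmac.compare_digest(hmac.new(key, ct, hashlib.sha256).digest(), tag):
        raise ValueError("record corrupt or tampered; nothing recoverable")
    return bytes(a ^ b for a, b in zip(ct, keystream(key, len(ct))))

key = secrets.token_bytes(32)
blob = encrypt(key, b"patient chart: penicillin allergy")
assert decrypt(key, blob) == b"patient chart: penicillin allergy"

# A single flipped bit -- a bad disk sector, say -- and the whole record is gone:
damaged = bytes([blob[0] ^ 1]) + blob[1:]
try:
    decrypt(key, damaged)
except ValueError as e:
    print(e)
```

With unencrypted data, the same bit flip would cost one character of one field, and the rest of the record would still be readable.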

In short, there are cases where encryption-at-rest can be dangerous and there are only a few cases where it can be helpful.

For clinicians, it is easy to draw a parallel: the risks associated with unneeded testing. A lay person assumes that if there is any chance that the “CAT scan might catch it”, then they should have a CAT scan. The clinician understands that this test comes with a cost (i.e. increased long-term cancer risk) and is not as “free” as the patient feels it is. The public only becomes aware of this when a testing scandal occurs, like the famous PSA test, where the harm was massively larger than the good the test provided.

The human body and information technology are both complex systems, and complex systems in general do not respond well to oversimplified interventions.

Moving back to Anthem.

Anthem has a responsibility, under HIPAA, to ensure that records remain accessible, which is much easier to do with unencrypted data. The fact that this data was not encrypted means very little. Little would have stopped a hacker with the level of access that these hackers achieved; encryption probably would not have helped.

By focusing on the encryption-at-rest issue, the mainstream press is missing the main story here. If indeed Anthem was targeted by sophisticated international hackers, then there is little that could have been done to stop them. In fact, assuming international actors were involved, this is not so much a failure of Anthem as a failure of the NSA, the government agency tasked with both protecting US resources and attacking other nations’ resources.

As much as the NSA has been criticized for surveilling Americans, it is their failure to protect against foreign hackers that should be making frequent news. Currently, the NSA employs a strategy in which it does not give US companies all of the information they could use to protect themselves, instead reserving some of it to ensure that it can break into foreign computer systems. This is a point that Snowden, and other critics like Bruce Schneier, continue to hammer: the NSA makes it easy to spy, for themselves and for others too.

It is fine to be outraged at Anthem, and I am sure they could have done more, but I can assure you that no insurance company or hospital in the United States is prepared to defend against nation-state-level attacks on our infrastructure. In fact, Anthem is to be applauded for detecting and cutting off the attack that it did find. Hackers are much like roaches: if you can spot one, there are likely dozens more successfully hiding in the walls.

53 Comments on "Anthem Was Right Not to Encrypt"


Guest
anonHCB
Feb 25, 2015

Just because encryption may not have prevented this attack does not mean Anthem was right not to encrypt. Anthem was wrong not to, regardless of whether it even played a role in all this.

Guest
Jon
Feb 25, 2015

Mm. Sorry. There is no excuse for not encrypting SSNs at rest. NONE. Also, they have admitted to subpar logging, no multi-factor authentication, and subpar intrusion monitoring. Stop giving them a pass. They skimped on security and got hit, and they should have to pay for it. Unfortunately, the penalty is only 2 years of credit monitoring. In my opinion, they should be hit with the requirement to provide lifetime credit monitoring for all involved. Raise the bar of punishment and companies will start acting professionally.

By the way, your carjacking analogy is weak. Data encryption at rest can happen reliably. Split keys and spread them around. Data/multi-key sharding. There are many possibilities. Think your car will get jacked if there is no battery installed, no tires, and no fuses? You have just raised the barrier.
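The split-key idea this commenter mentions can be sketched in a few lines. This is a toy n-of-n XOR secret-sharing scheme (not Shamir's scheme, which supports k-of-n thresholds): each share is stored on a different system, every share is required to rebuild the key, and any proper subset reveals nothing.

```python
import secrets

def split_key(key: bytes, n: int) -> list[bytes]:
    # n-of-n XOR sharing: generate n-1 random shares, then a final share
    # that XORs with them back to the key. All n are needed to recover it.
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def join_key(shares: list[bytes]) -> bytes:
    key = bytes(len(shares[0]))  # start from all zeros
    for s in shares:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key

key = secrets.token_bytes(32)
shares = split_key(key, 3)          # store each share on a different system
assert join_key(shares) == key      # all three together rebuild the key
assert join_key(shares[:2]) != key  # a stolen subset is just random noise
```

An attacker now has to compromise every system holding a share, which is exactly the "raised barrier" the comment describes.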

Guest
Feb 23, 2015

Anthem’s admission that it did not encrypt this database has, oddly, occupied everyone’s attention even after Anthem admitted to other things it did not do, which are far more relevant to whether the company could have prevented this attack. As I wrote on Feb. 14, http://www.ibj.com/articles/51789, Anthem told its employer clients that it did not use multi-factor authentication throughout its IT systems (even though every ATM in America uses that concept and even though Medicare requires its use around all sensitive data) and that it did not employ user behavior analytics.

That second concept would have given Anthem a much better chance of noticing that someone was transferring an estimated 35 gigabytes of data out of its system over the course of seven weeks. One cybersecurity expert described the fact that Anthem did not detect that much data leaking out of its system over such a long period of time as “outrageous” and “shocking.”

Encryption most likely would not have prevented this data breach, although it might have slowed the hackers down a bit, giving Anthem a better chance at detecting their activity. Anthem’s lack of encryption seems most significant as a signal that it was not doing everything it could to protect consumers’ data. Encryption is a standard feature on the most common database software, and encryption at rest adds about an 8 percent delay in processing time, so the price of encrypting seems fairly small given the size of the risk. Then again, hindsight is always 20-20.
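The user-behavior-analytics idea raised here (noticing an estimated 35 gigabytes leaving over seven weeks) can be sketched as a trailing-baseline egress monitor. The traffic numbers below are hypothetical, and real products use far richer models; this just shows the shape of the technique.

```python
from statistics import mean, stdev

def egress_alerts(daily_bytes, window=14, sigmas=3.0):
    """Flag days whose outbound transfer volume is far above the trailing
    baseline -- a toy version of the user-behavior-analytics idea."""
    alerts = []
    for i in range(window, len(daily_bytes)):
        base = daily_bytes[i - window:i]
        mu, sd = mean(base), stdev(base)
        # max(sd, 1.0) avoids a zero threshold on perfectly flat baselines
        if daily_bytes[i] > mu + sigmas * max(sd, 1.0):
            alerts.append(i)
    return alerts

# 14 quiet days, then a sustained exfiltration ramp (hypothetical numbers).
normal = [200_000_000] * 14   # ~200 MB/day of routine traffic
leak = [900_000_000] * 7      # ~900 MB/day during exfiltration
assert egress_alerts(normal + leak) != []   # the first leak day is flagged
assert egress_alerts(normal + normal) == [] # steady traffic raises no alarm
```

Even a crude monitor like this flags the first anomalous day; seven weeks of sustained exfiltration is exactly the kind of signal such systems exist to catch.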

Guest
Feb 10, 2015

“Many people have been surprised to hear that this sensitive data was not encrypted and that the federal mandate for securing health-related data, HIPAA, does not require it to be. In fact, HIPAA only “strongly encourages” encryption. Organizations that choose not to use encryption are supposed to document the reasons why not and implement an “equivalent alternative measure if reasonable and appropriate.” The vagueness of this requirement is the crux of class action and other lawsuits being filed against Anthem.

But even if Anthem had used encryption, the data could have still have been compromised. Encryption is just one part of the arsenal that organizations need to deploy to secure sensitive data. Encryption is great for securing data in transit and at rest, but if the credentials and keys are compromised it does little to protect the data.

The bigger issue in many breaches is that organizations haven’t properly implemented data access security controls. They need to have safeguards in place in case attackers can bypass perimeter defenses and compromise administrator level credentials.”

http://www.technologyreview.com/view/535111/encryption-wouldnt-have-stopped-anthems-data-breach/

Guest
Stoney DeVille
Feb 11, 2015

” … but if the credentials and keys are compromised it does little to protect the data.”

This is a next-to-impossible scenario when the data, encryption tool, key, and secret are stored in separate locations… a common design, but again something few people do, so few people know. Not storing them separately would be like having a pile of money with a lock sitting on top of it and the key in the lock. Your scenario of all items being compromised (as I pointed out in my comment) would require an inside job.

Guest
Stoney DeVille
Feb 10, 2015

If Anthem is not cypher-texting my demographic data in the database tables, then yes… I blame them and everyone else who is not “taking the trouble” to do so. It’s not that hard; it’s just not a priority for most healthcare companies because it’s “extra work”. Not ALL of the data needs to be encrypted, just the identifying attributes. This renders the rest of the health data de-identified.

Encrypted disks, connections, and database files are useless when someone gains access to database login credentials. It is very easy to write a Trojan horse that looks for databases, tables, and column names like Insurance, Patient, SSN, etc., and then still easy enough to wrap that data up and send it out to a web server somewhere. But it is extremely difficult to decrypt this data if it’s encrypted with AES-256 keys stored somewhere else on the network and salted with additional values compiled into an application somewhere else on the network. Brute-force decrypting the stolen data would be very expensive in compute resources, from the electrical bill alone. The thief would have to decide if the data was worth the cost to decrypt; in some cases it would be. It’s all about lowering risk.

Cyphering the demographic data with a key stored elsewhere, executed by an application compiled elsewhere, would be nearly impossible to hack without real help on the inside. That would then not be a “sophisticated” hack; it would be an inside job. Under HIPAA, demographic data should be shown only when absolutely necessary, to a caregiver who has an actual use for the information. This is usually handled by the front-end software and rarely shown all together.

Sometimes the argument for not encrypting demographic information is related to the need to easily search the data. I have handled this by decrypting the data in memory and then searching, and with modern servers this can perform well. I also perform patient matching and de-duping using similar in-memory techniques.
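The column-level approach this commenter describes (cipher only the identifying attributes, keep the key elsewhere, decrypt in memory to search) might look roughly like the sketch below. The XOR cipher and in-process key variable are stand-ins for illustration only; a real deployment would use AES with per-record nonces and a separate key-management host.

```python
import hashlib
import secrets

def xor_with_key(key: bytes, data: bytes) -> bytes:
    # Toy symmetric cipher (hash-derived keystream); NOT safe for real use.
    ks, i = b"", 0
    while len(ks) < len(data):
        ks += hashlib.sha256(key + i.to_bytes(8, "big")).digest()
        i += 1
    return bytes(a ^ b for a, b in zip(data, ks))

# In a real system this key lives on a separate key-management host.
column_key = secrets.token_bytes(32)

# Only identifying attributes are ciphered; clinical data stays in the clear,
# which leaves the table de-identified if stolen on its own.
rows = [
    {"ssn": xor_with_key(column_key, b"123-45-6789"), "dx": "hypertension"},
    {"ssn": xor_with_key(column_key, b"987-65-4321"), "dx": "asthma"},
]

# A thief who dumps the table sees no SSNs.
assert all(b"123-45-6789" not in r["ssn"] for r in rows)

# A legitimate search decrypts in memory, matches, and discards the plaintext.
def find_by_ssn(rows, key, ssn: bytes):
    return [r for r in rows if xor_with_key(key, r["ssn"]) == ssn]

assert find_by_ssn(rows, column_key, b"123-45-6789")[0]["dx"] == "hypertension"
```

The in-memory search is the cost the commenter mentions: every lookup pays a decryption pass, which modern servers can absorb for modestly sized identity columns.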

Guest
Stoney DeVille
Feb 10, 2015

I am not assuming to know the root cause of the breach at Anthem. What I detail is still a major potential issue at most healthcare organizations.

Guest
Feb 10, 2015

In the end, encryption had little to do with the breach. The roots of the Anthem breach may go back as far as April 2014, which could link it to the CHS breach and the same vulnerability: the Heartbleed bug. This isn’t Anthem’s first bite at the HHS penalty apple, either. They paid $1.7M (under the Wellpoint banner) to HHS in 2013 for 612,000 records that were accessible over the internet between 2009 and 2010.

Are The Data Breaches At Anthem And CHS Linked? http://onforb.es/1CeAiKD via @forbes

Guest
Feb 10, 2015

My wife and I are not even Anthem customers, but her employer (where we get our health insurance) has notified employees that if we’ve used providers who also take Anthem insurance, we may have been picked up in the hack. That is infuriating. It goes to some of Adrian Gropper’s points above (minimization, and persistence). If I’m not an Anthem subscriber, what right do they have to my personal information from a provider I’ve seen who also contracts with them? I would think that Anthem’s competitors will not be amused to know that Anthem is mining the data of THEIR customers as well.

Guest
Feb 16, 2015

Thanks for commenting, Bobby… it’s always nice to hear from you.

If I understand what you are saying correctly, you might consider publishing a “blurry version evidence letter” on your blog. This is the first I have heard that non-Anthem members might be included. If that is true, it is a very, very big deal.

-FT

Guest
Feb 16, 2015

It IS a big deal. That’s what employees at Cheryl’s company were told in the wake of the breach notice: that if our treating provider(s) also took Anthem insurance, Anthem may have our data as well, notwithstanding that they are not HIPAA CEs with respect to us and have no legal right to our PHI/PII.

Guest
Feb 17, 2015

I expect that there will be a lot of talk on the matter, and I also expect that Anthem will have resisted putting anything like this in writing.

If you do have something in writing… well that would make for good fun wouldn’t it?

-FT

Guest
Thomas Lukasik
Feb 10, 2015

>> RE “They presume that encrypted records are always more secure than encrypted records, which is simplistic and untrue.”

This appears to be a typo — shouldn’t it read: “They presume that encrypted records are always more secure than un-encrypted records..”?

TJL

Guest
Feb 16, 2015

I think my editors might have fixed this for us… thank you for pointing it out…

Guest
Thomas Lukasik
Feb 16, 2015

NP. And yes, it was fixed pretty quickly — maybe you can just delete my (now obsolete) comment.

Guest
Feb 17, 2015

No, I think not. Since the error was copied to other blogs, you deserve credit for finding it, my editors deserve credit for fixing it, and I deserve the blame for the original mistake.

Whoops.

-FT

Guest
Feb 10, 2015

Fred,
Nicely written post. There’s clearly a balance of responsibility here, and I expect we will learn that Anthem could have done more. It is time that major health care institutions build those higher walls and deeper moats–and the cost in dollars and inefficiency will be a cost of doing business.

Also, nice to see this linked on Morning Consult.

Guest
Feb 10, 2015

Wow, this is an interesting perspective to say the least. Let me give a technologist’s response.

Encryption is a valuable technology if you understand what it is and what it is not. Encryption at rest is certainly a best practice for hospitals.

At the strict end of the scale, encrypting at rest can mean a different key for every file or even block stored on media. This is generally considered impracticable, but it can be done. On the other end of the scale, whole disk encryption uses the premise that a single key (ideally entered at boot) accesses the entire data set.

Whole disk encryption is very common. It is most useful when you need to protect media that may be stolen, i.e. when a thief enters your data center or when a laptop is removed from the boot of your car.

Whole disk encryption is of basically no value against online penetration attacks such as the attack against Anthem. Once the attacker has administrator access, all files are in the clear.

Did Anthem use whole disk encryption? That doesn’t appear to be clear. Whole disk encryption would certainly meet HIPAA requirements and show intent to protect. It wouldn’t help much in this specific situation.

So, what about single file or even directory encryption on top of whole disk encryption? It can be done. However, unless the keys are managed in such a way that the systems administrator cannot obtain them at will, it is of no value. In most hospitals the infrastructure and budget for such a system does not exist. Practically speaking you are more likely to see this kind of approach in a defense environment.

So what’s right? Well, whole disk encryption is base protection that should always be there; it is valuable and necessary to protect against physical theft. More complex forms of encryption, such as file- and directory-level encryption, must be combined with software and systems administration practices that support their use, and frankly, there is a huge skills gap in this area.

Unfortunately, Anthem was wrong and I think HHS will inform them of that in no uncertain terms.

Guest
Feb 16, 2015

I still maintain that encrypting at rest is only sometimes “best practice”. It is certainly not common practice, and it certainly would not have been helpful here.

Insurance companies need to run reports covering large numbers of patients all of the time. The person whose credentials were hacked appears to have had that type of role… so even the subtle partial-encryption methods you are describing here would not have helped much.

It will be very interesting to see if HHS says anything about the Anthem situation, as that will be telling about what they think “best practices” are. That is a great point!

-FT

Guest
Stoney DeVille
Feb 17, 2015

Fred,
Can you please point us to the root-cause documentation of this breach at Anthem? Something that shows exactly how this data was stolen, including versions of operating systems, databases, password policies, etc. Without a clear understanding, it is really difficult to draw conclusions about what would and would not have helped SPECIFICALLY in this case. Generally, encrypting “properly”, both in transit and at rest, is a best practice as defined by HIPAA regulation and any cipher professional. Unfortunately, the guidance provided by HIPAA law is not specific enough for “newbies” and is also a bit outdated. My experience in healthcare sees a minimal effort exerted by health IT companies. I wish this were not true.

Guest
Feb 16, 2015

“This decision will depend on a variety of factors, such as, among others, the entity’s risk analysis, risk mitigation strategy, what security measures are already in place, and the cost of implementation. The decisions that a covered entity makes regarding addressable specifications must be documented in writing. The written documentation should include the factors considered as well as the results of the risk assessment on which the decision was based.”

Is it clear? No. But if you choose not to encrypt at rest, you had better write down WHY and get good counsel.

Guest
Saurabh Jha
Feb 10, 2015

There is one word for this: fragile.

EMR will turn out to be one of civilization’s most fragile innovations.

Guest
Feb 10, 2015

These data were filched out of an EMR? Ya learn something every day.

Guest
Saurabh Jha
Feb 10, 2015

No, but EMR will have its own problems, which this episode highlights.

Integrated systems tend to be fragile. They offer several advantages though.

Guest
Tom Burton
Feb 10, 2015

Anthem had a duty to protect the personally identifiable information (PII) of its customers. Encryption at rest is a minimum “best practice” for Social Security numbers and dates of birth. Other techniques, such as tokenization, could also have been employed to protect PII. Unfortunately, we will never know if these techniques would have protected our PII. If Anthem had used them I would be more willing to cut them some slack.

By the way, there is a reason that cars come with keys. While keys do not always stop the theft of an automobile, how much worse would car theft be if we lived in a world in which car keys never existed? Do you leave your house unlocked? After all, by your logic the thieves are going to get in anyway.
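Tokenization, the other technique this commenter mentions, can be sketched briefly: the real SSN never enters the claims database at all, only a random token does, and the token-to-value mapping lives in a separately secured vault. All names below are illustrative.

```python
import secrets

class TokenVault:
    """Toy tokenization: PII is swapped for a random token; the mapping
    lives in a separately secured vault, not in the application database."""

    def __init__(self):
        self._forward = {}   # token -> real value
        self._reverse = {}   # real value -> token (so repeats reuse tokens)

    def tokenize(self, value: str) -> str:
        if value not in self._reverse:
            token = "tok_" + secrets.token_hex(8)
            self._forward[token] = value
            self._reverse[value] = token
        return self._reverse[value]

    def detokenize(self, token: str) -> str:
        return self._forward[token]

vault = TokenVault()
token = vault.tokenize("123-45-6789")

# The claims database stores only the token; a stolen copy contains no SSNs.
assert token.startswith("tok_")
assert "123-45-6789" not in token
assert vault.detokenize(token) == "123-45-6789"
```

Unlike encryption, a token has no mathematical relationship to the original value, so there is no key to steal; the attacker must breach the vault itself.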

Guest
Feb 16, 2015

You are extending a metaphor well past the point I would defend it.

My whole point is that encryption at rest is only sometimes the “best practice”.

Merely claiming, briefly, that something is the best practice doesn’t make it so.

Guest
Jon
Feb 25, 2015

I call BS. Best practices are not subjective; they either are or are not a best practice. Encrypting SSNs at rest is most definitely a best practice that always makes sense.

Guest
Feb 9, 2015

Nice mention of this post in Medcity News 🙂

http://medcitynews.com/2015/02/encryption-might-stopped-anthem-hack/

I am withholding judgment until more information is available on the specifics of the Anthem breach, but I would say that encryption is generally a good thing even if it would not have stopped the hackers in this specific case. I maintain it should be a requirement for healthcare data to be encrypted at rest (however, see Ars Technica: http://arstechnica.com/security/2015/02/why-even-strong-crypto-wouldnt-protect-ssns-exposed-in-anthem-breach/).

I also strongly agree with Michael Turpin: a national patient identifier could at least stem the bleeding in these cases. SSNs should not be used in healthcare except for disability claims, and even then they may not be completely necessary…

Guest
Feb 9, 2015

Change patient identifiers to exclude Social Security numbers and you solve the insurer issue. I’m pretty certain the plans don’t need SSNs to delineate patients. As for the failing of the NSA: buckle up, America, this is just the beginning. The digital age is fraught with unintended consequences. I’m not sure that we can lay the crime at the feet of anyone; the key is risk mitigation. I wonder if anyone shorted Anthem prior to the breach. Now that would be interesting…

Guest
Feb 9, 2015

See my reply to Peggy…