
For Healthcare Cybersecurity the Whole is Weaker Than the Sum of the Parts

Before addressing the special attractions and vulnerabilities of healthcare data and software, a little background on the cybersecurity of complex systems may be helpful. The single most important lesson from our experience with conventional networked systems is that all of them can be hacked, and all will eventually be hacked. There is a simple equation for hackers: their investment scales with the value of the data. Alas, because electronic health records (EHRs) have a relatively high value to criminals, we should expect hackers to make significant efforts to penetrate them. (More on this later.) Our experience also teaches us that erecting protections to mitigate hacking is never, by itself, an adequate defense. Health IT leaders must also invest significant effort in monitoring the EHR system for unanticipated behavior. Equally critical, they must plan how to respond to the attacks they detect.

Two mistakes: Organizations typically make two related mistakes. First, they are uninformed about the sophistication and resources of attackers, and so underestimate their opponents. Second, they assume their systems are much less vulnerable than they actually are.


Hackers’ Skills and Patience: Hackers possess a spectrum of skill levels and of willingness to invest resources in creating successful attacks. Attacker expertise ranges from computing beginners to highly experienced, even gifted, professional engineers and scientists. Hackers may possess the skills to compromise network protocols, application software, system software, hardware, hardware components, the enterprise’s authorized staff, or any combination of the above. Depending on the perceived value of the system and its data, hackers’ investment in attacks can range from casual probing to well-funded, long-term engineering efforts that pay off only after months of patiently probing the system and then creating and installing custom software explicitly designed to exploit its unique features. Today’s criminal enterprises have been observed to invest millions of dollars, years of professional engineering time, and specialized equipment, spread over months or even years, to compromise a single device.

All systems harbor vulnerabilities, so discovering them is only a matter of patience, experimentation, and engineering effort. Once vulnerabilities are discovered, hackers can craft exploits that take advantage of the found defects. As a consequence, modern attacks by professionals tend to unfold in surreptitious stages over days, weeks, or months, with each stage gathering the information needed to gain the desired level of access in the next. The other side of this process is that those in charge of protecting the systems can, and must, diligently monitor them to detect and respond to these kinds of attacks before the hackers achieve their ultimate goal.
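To make this concrete, here is a minimal monitoring sketch in Python. It assumes a hypothetical authentication-log format and an arbitrary alerting threshold of our own choosing; real EHR deployments would feed richer telemetry into dedicated intrusion-detection tooling. The point is simply that watching for low-level anomalies, such as bursts of failed logins, is how the staged attacks described above get caught early.

```python
# A minimal sketch, not a real SIEM: flag bursts of failed logins that may
# signal the early, information-gathering stage of a staged attack.
# The log format, field names, and thresholds are illustrative assumptions.
from collections import defaultdict
from datetime import datetime, timedelta

FAILURE_THRESHOLD = 10          # failed logins per source before alerting
WINDOW = timedelta(minutes=15)  # sliding window for counting failures

def scan_auth_log(lines):
    """Yield (source, count) whenever a source exceeds the failure threshold."""
    failures = defaultdict(list)  # source address -> recent failure timestamps
    for line in lines:
        if "LOGIN_FAIL" not in line:
            continue
        fields = line.split()
        # Assumed format: "2016-09-22T14:03:11 LOGIN_FAIL user=jdoe src=10.0.4.17"
        ts = datetime.fromisoformat(fields[0])
        tags = dict(f.split("=", 1) for f in fields[2:])
        src = tags.get("src", "unknown")
        failures[src] = [t for t in failures[src] if ts - t <= WINDOW] + [ts]
        if len(failures[src]) == FAILURE_THRESHOLD:
            yield src, len(failures[src])

if __name__ == "__main__":
    sample = [f"2016-09-22T14:0{i}:00 LOGIN_FAIL user=jdoe src=10.0.4.17"
              for i in range(10)]
    for src, count in scan_auth_log(sample):
        print(f"ALERT: {count} failed logins from {src} within {WINDOW}")
```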

Our People and the System as a Whole: Another lesson from conventional computing is that we frequently fail to understand that the personnel who operate and use a system are an important part of its defenses. This is because security is a global system property that emerges from how all the components have been integrated. In other words, it is foolhardy to think of security as just protecting each of the parts. Instead, we must protect the entire system and network, where all of the parts are interrelated and interacting. Vulnerabilities exist in the system as a whole, not just in its individual components. Moreover, we are always trading security off against the system’s other global emergent properties, such as functionality, performance, energy usage, and usability. Because the security achieved is the result of trade-offs, whatever technical mitigation is missing must be made up in how the system is operated, that is, by the personnel operating and using it. Consequently, in most systems, most unmitigated vulnerabilities are defended only by users conforming to social norms. But by definition hackers do not so conform. Hence it is important for the operating staff and user communities to understand and commit to the organization’s policies, and to be sufficiently informed to watch for and report discrepancies between the system’s expected and actual behavior. It is implausible to expect personnel to do their part simply because they were told to do so. Except for the few employees directly focused on cybersecurity, a user’s first perceived job is her or his job, not the protection of the system.

Equally essential, organizations must develop security policies that make sense. Asking employees to log in to a system with elaborate codes, badges, biometrics, et cetera 200 or 300 times a day just generates circumventions, not because of laxness or laziness, but because employees are trying to do their jobs and fulfill the mission of the organization. Nurses and physicians who must go through elaborate authentication processes (log-ons) with frequently changing passwords for six separate devices, hundreds of times each day, will attach passwords to each device with yellow stickies and will develop methods for keeping the machine available at all times. Organizations have an obligation to consider the real-world consequences of cybersecurity rules and to adjust them as circumstances change. If it takes a new employee a month to get a password after changing positions or units, someone will find a way to let that employee work, and/or will come to see the cybersecurity staff as insensitive or worse. Similarly, security procedures should follow logical rules: a computer in a triple-locked operating room may not need the same level of manual log-in authentication as one in a hallway, although it may need better defenses against hackers from the Internet. And putting a low-fat cookie recipe behind an organization’s extensive firewall invites vulnerabilities (for example, when an employee shares a password so a relative can see the recipe) and, perhaps worse, corrodes users’ and employees’ sense of responsibility and their faith in the computer security team. Users and operators therefore require training and dialogue about the system and its policies if they are also to focus on security. And this training must be refreshed and kept understandable as the system evolves and new subsystems are brought on-line.

With that background, we now focus on healthcare data and systems:

Healthcare data are especially difficult to protect: Electronic health records present several notable challenges:

First, today’s medical IT infrastructure has been adapted from conventional computing and was not designed with the needs of health care expressly in mind. Hence the current generation of microprocessor-based healthcare equipment is usually better suited to office-like and traditional data-processing workflows than to clinical environments. As noted, this is especially clear in the use of passwords as an access-control mechanism. Using equipment only partially adapted from a workflow alien to the clinical setting leaves gaps that system users have to cover with their own behavior, and each such gap creates potential attack points. Evolving the system’s workflow model away from the office and toward the realities of clinical practice is a major design challenge with significant opportunities to improve cybersecurity.

A second challenge is that many of today’s medical devices were first designed and built as stand-alone devices, not as networked components integrated into a larger system. As such, they lack the very basic functionality needed for their new role. This is analogous to when conventional computing was extended from terminals attached to time-sharing systems to networks of workstations and PCs in the early 1980s. Integrating legacy medical devices into the new networked architectures requires a large amount of new software just to provide the missing functionality. This integration process itself creates novel security problems that never existed for the stand-alone design. As an example, a stand-alone insulin pump assumes a single or at most a small number of login accounts for operators. In contrast, when networked, practically any clinician on a patient’s care team may need to log into the device, and so the authentication system needs to be re-worked from scratch to integrate into the facility’s network-wide authentication framework.
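As a hedged illustration of what that rework might involve, the sketch below has the device validate a short-lived token issued by the facility’s central authentication service instead of keeping its own operator accounts. The token format, the shared key, and the role names are our own assumptions for illustration, not any vendor’s actual interface.

```python
# Illustrative sketch only: a networked device that trusts the facility's
# central identity service rather than maintaining local operator accounts.
# The token format and shared key below are assumptions for illustration.
import base64, hashlib, hmac, json, time

FACILITY_KEY = b"facility-wide-shared-secret"   # in practice: provisioned securely

def issue_token(clinician_id: str, role: str, ttl_seconds: int = 300) -> str:
    """What the central auth service might hand a clinician after login."""
    claims = {"sub": clinician_id, "role": role, "exp": time.time() + ttl_seconds}
    body = base64.urlsafe_b64encode(json.dumps(claims).encode())
    sig = hmac.new(FACILITY_KEY, body, hashlib.sha256).hexdigest()
    return body.decode() + "." + sig

def device_accepts(token: str, required_role: str = "care_team") -> bool:
    """What the pump checks before unlocking its clinician interface."""
    try:
        body, sig = token.split(".")
        expected = hmac.new(FACILITY_KEY, body.encode(), hashlib.sha256).hexdigest()
        if not hmac.compare_digest(sig, expected):
            return False
        claims = json.loads(base64.urlsafe_b64decode(body))
        return claims["exp"] > time.time() and claims["role"] == required_role
    except (ValueError, KeyError):
        return False

if __name__ == "__main__":
    token = issue_token("rn.lopez", "care_team")
    print("device accepts token:", device_accepts(token))
```

The design point is architectural: the authentication decision moves from each individual device to a facility-wide service, so access for an entire care team can be granted or revoked without touching the pump itself.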

The third challenge, then, is that stand-alone medical devices must be re-architected and re-manufactured with networking and distributed computation in mind, and then redeployed into modern electronic healthcare environments. Experience with conventional computing systems suggests that the current arrangement of legacy devices jury-rigged into a larger network is unlikely to withstand prolonged attacks by today’s professional criminal attackers.

A fourth and perhaps most significant challenge is that health care has evolved into specialties split across many disciplines and even organizational boundaries. A patient may have several health records, which must travel across all of these system boundaries to satisfy the clinical, scientific, and business objectives involved in that patient’s care and billing. As a practical matter, this means that the security and privacy of an EHR are governed by the least-common-denominator needs of all the groups and organizations requiring access. Different organizations and groups are subject to different opportunities, incentives, and constraints, many of which contradict one another. Thus, finding compromises that both protect the patient and enable the broad range of specialists requiring EHR access is a daunting conceptual and real-life problem. This lack of uniformity in health record handling presents opportunities for attackers to exploit. Somehow the system must be better rationalized to reduce those opportunities.

A fifth challenge is that healthcare data, especially electronic health records, are worth ten to thirty times more on the black market than credit card numbers. A cyber-thief can buy purloined credit card numbers for anywhere from about $2.50 to $10 each (assuming one is buying in bulk). In contrast, EHR data are worth up to $65 per record. Why the difference? A credit card can be used once or a few times and then it is stopped. But a healthcare record (1) contains your credit card number anyway, in addition to your Social Security number; (2) can be used for blackmail; and (3) most important, is the gift that keeps on giving if used by an unscrupulous medical provider. A physician, physical therapist, pharmacy, or oxygen supply company can bill Medicare or an insurance company for millions of dollars, and the unscrupulous vendor or provider will know exactly the kinds of services, procedures, or supplies appropriate for each patient. The fraud is aided by incomprehensible (and intentionally opaque?) medical billing processes and by the reality that many sick people are not carefully examining the Byzantine paperwork that accompanies any medical event.

A sixth challenge is created by the increasing ability of patients to directly input data into their personal health records (PHRs) and even into their EHRs. These data can be self-reports (e.g., what I ate, how I felt), or they can come from devices such as Fitbits, cell phones, other apps, and medical devices (e.g., heart monitors, pacemakers, insulin pumps, scales, and blood pressure monitors). There are hundreds of thousands of medical apps out there. The most obvious issue is the trustworthiness of the data: Should the data be accepted into the PHR or EHR without review? Is the clinician obliged to review it, or to accept it?
What of diary-type entries, reflecting weight loss desires, foods not eaten, sexual activities, or child care worries?

What are the legal implications of this process? Must clinicians act upon something that might be suspicious but is probably not consequential? How can clinicians possibly be expected to find the important needles in the massive haystacks of data that could inundate their practices?
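One modest, partial answer is to screen patient-generated values before they are silently accepted. The sketch below is our own illustration, with made-up field names and plausibility ranges rather than clinical guidance; it merely routes readings to accept, review, or reject so that clinicians are not left to find every anomaly by hand.

```python
# Sketch: screen patient-submitted vitals before they enter the record.
# The ranges and field names are illustrative assumptions, not clinical guidance.
PLAUSIBLE_RANGES = {
    "systolic_bp": (70, 250),    # mmHg
    "diastolic_bp": (40, 150),   # mmHg
    "heart_rate": (30, 220),     # beats per minute
    "weight_kg": (2, 350),
}

def triage_reading(measure: str, value: float) -> str:
    """Return 'accept', 'review', or 'reject' for a single self-reported value."""
    if measure not in PLAUSIBLE_RANGES:
        return "review"                  # unknown measure: a human should look
    low, high = PLAUSIBLE_RANGES[measure]
    if low <= value <= high:
        return "accept"
    # Wildly implausible values (sensor on the dog, unit confusion) get rejected;
    # borderline ones are queued for clinician review.
    if value <= 0 or value > high * 2:
        return "reject"
    return "review"

if __name__ == "__main__":
    for m, v in [("systolic_bp", 120), ("heart_rate", 700), ("weight_kg", 400)]:
        print(m, v, "->", triage_reading(m, v))
```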

A seventh challenge is the vulnerability of data in cell phones and other personal devices, whether medical (e.g., pacemakers, insulin pumps) or health-related (e.g., exercise machine records, pedometers). As noted earlier, these devices were often not designed with cybersecurity as an essential requirement. In particular, today’s hackers have the skills and expertise to discover any of these devices from the Internet, reverse engineer their software, create malware explicitly for them, and then install the malicious code on the targeted devices, all remotely across the Internet. Creating devices more resistant to these threats is therefore a major opportunity.
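One small example of such resistance, offered as an assumption-laden sketch rather than a complete design, is for a device to refuse any software update whose contents it cannot verify. Real devices would rely on asymmetric signatures, secure boot, and protected key storage; the digest pinning below merely illustrates the idea with the standard library.

```python
# Sketch: reject firmware updates whose contents do not match a digest the
# manufacturer published out of band. Digest pinning alone is not a complete
# defense; it stands in here for proper signed-update verification.
import hashlib

# Assumed to be baked into the device image at manufacture time.
# (This digest is SHA-256 of the demo payload b"test" below.)
TRUSTED_UPDATE_DIGESTS = {
    "pump-fw-2.4.1": "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def verify_update(name: str, image_bytes: bytes) -> bool:
    """Install only if the image hashes to the digest trusted for this release."""
    expected = TRUSTED_UPDATE_DIGESTS.get(name)
    if expected is None:
        return False                      # unknown release: refuse to install
    return hashlib.sha256(image_bytes).hexdigest() == expected

if __name__ == "__main__":
    print(verify_update("pump-fw-2.4.1", b"test"))   # True: digest matches
    print(verify_update("pump-fw-2.4.1", b"evil"))   # False: tampered image
```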

An eighth and related challenge is the safety of the data in transit from devices to one’s EHR or PHR. Cell phones operate in “open space.” Many other devices rely on cell phones or on Wi-Fi of uncertain security. (The first author of this article discovered the WEP flaws in July 2000 and subsequently served as the architect for WPA2 when he worked at Intel, work instrumental to his promotion to the role of chief cryptographer).
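The practical lesson is to treat the link layer as untrusted and to protect the data end to end. The sketch below, which posts a reading to a hypothetical endpoint that we invented for illustration, insists on TLS with certificate verification so that confidentiality does not depend on whether the underlying hop is WEP, WPA2, or an open hotspot.

```python
# Sketch: send a device reading over TLS with certificate verification enforced,
# so its confidentiality does not depend on Wi-Fi or cellular link security.
# The host and path are hypothetical; no such endpoint is implied to exist.
import http.client
import json
import ssl

def upload_reading(payload: dict, host: str = "ehr.example-hospital.org") -> int:
    context = ssl.create_default_context()          # verifies the server certificate
    context.minimum_version = ssl.TLSVersion.TLSv1_2
    conn = http.client.HTTPSConnection(host, context=context, timeout=10)
    try:
        conn.request(
            "POST",
            "/api/device-readings",                 # hypothetical path
            body=json.dumps(payload),
            headers={"Content-Type": "application/json"},
        )
        return conn.getresponse().status
    finally:
        conn.close()

if __name__ == "__main__":
    try:
        print(upload_reading({"measure": "heart_rate", "value": 72}))
    except OSError as exc:                          # expected: the host is fictional
        print("upload failed:", exc)
```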

A different, and ninth, challenge is the motivation of both patients and the health care system’s personnel to cheat. Many wellness programs and other health insurance programs reward activities that are thought to enhance health. Thus, more walking steps per day, routine exercise, smoking cessation, or weight loss are rewarded with money or reductions in insurance premiums. Similarly, some programs punish employees on the basis of these data with higher premiums and public exposure. Of course, humans are very smart and we therefore enjoy reports of fitness trackers attached to ceiling fans, dog collars, and even electric drills. Smokers will substitute the urine of others for their own; people will carry weights and wear heavy shoes when being weighed for the baseline measurements.

The tenth challenge is the lack of data standards for healthcare data. One system lists your blood pressure as 120/80, another as “diastolic of 80 and systolic of 120” (in alphabetical order), another as “labile” or as “not compliant with medications,” and yet another records your blood pressure as “stable.” Even if the software sends the information from one system to another, the clinician seeking to treat you may never see it when words are jumbled into the data fields (computers don’t handle ambiguity very well), and even if the data do reach her, she can’t effectively use information labeled “labile” or “stable.” She needs the numbers. This data-standard chaos affects security because healthcare systems are then obliged to work around the lack of standards by passing the information through additional programs and interfaces (APIs) that help move it across systems. But introducing additional software, which often does not encrypt the data, just creates additional vulnerabilities.
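To make the data-standards problem concrete, here is a small sketch of our own (not any standard’s reference implementation) that tries to coerce the blood pressure representations above into numeric systolic and diastolic fields, and flags the ones, such as “labile” or “stable,” whose numbers simply cannot be recovered:

```python
# Sketch: normalize heterogeneous blood pressure strings into numeric fields.
# The input formats are the ones described in the text; anything else is flagged
# for human review because the numbers cannot be reconstructed from prose.
import re
from typing import Optional, Tuple

def parse_bp(raw: str) -> Optional[Tuple[int, int]]:
    """Return (systolic, diastolic) in mmHg, or None if unrecoverable."""
    text = raw.strip().lower()

    # Form 1: "120/80"
    m = re.fullmatch(r"(\d{2,3})\s*/\s*(\d{2,3})", text)
    if m:
        return int(m.group(1)), int(m.group(2))

    # Form 2: "diastolic of 80 and systolic of 120" (either order)
    dia = re.search(r"diastolic\D{0,10}(\d{2,3})", text)
    sys_ = re.search(r"systolic\D{0,10}(\d{2,3})", text)
    if dia and sys_:
        return int(sys_.group(1)), int(dia.group(1))

    # Free-text characterizations ("labile", "stable", "not compliant with
    # medications") carry no recoverable numbers.
    return None

if __name__ == "__main__":
    for raw in ["120/80", "diastolic of 80 and systolic of 120", "labile"]:
        print(repr(raw), "->", parse_bp(raw))
```

Every such ad hoc translator is more code to write, deploy, and secure, which is exactly the vulnerability the paragraph above describes.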

An eleventh challenge is that local groups of hospitals and clinician practices often create Health Information Exchanges (HIEs) that allow doctors and hospitals to access data from other participating institutions and practices. The value of these HIEs is compromised by the reality that they seldom have full participation from all local healthcare organizations, so searching for a patient often produces no information. Also, because they are local, they will miss patients who have recently moved to an area or are just passing through. Last, the lack of a unique patient ID in the USA means they often cannot differentiate medical information on Robert Smith, Robert J Smith, Smith RJ, Smith R, RSmith, RJSmith, and so on. A clinician must therefore often examine the records of hundreds of patients in hopes of finding the right one, further exposing patient data to many eyes.
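Absent a unique national patient identifier, matching is probabilistic at best. The sketch below is purely illustrative (real master-patient-index software uses richer demographics and tuned statistical models): it normalizes name variants and scores candidates by string similarity plus date of birth, which hints both at why matching partly works and at why so many near-miss records end up in front of human eyes.

```python
# Sketch: crude probabilistic matching of name variants against an index,
# using only the standard library. Real master-patient-index systems use
# richer demographic fields and tuned statistical models.
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lowercase, strip punctuation, and make word order irrelevant ('Smith RJ')."""
    cleaned = "".join(c for c in name.lower() if c.isalpha() or c.isspace()).split()
    return " ".join(sorted(cleaned))

def match_score(query_name: str, query_dob: str,
                cand_name: str, cand_dob: str) -> float:
    """Combine name similarity with a date-of-birth bonus; higher is better."""
    name_sim = SequenceMatcher(None, normalize(query_name), normalize(cand_name)).ratio()
    dob_bonus = 0.3 if query_dob == cand_dob else 0.0
    return min(1.0, 0.7 * name_sim + dob_bonus)

if __name__ == "__main__":
    index = [
        ("Robert J Smith", "1948-03-09"),
        ("Roberta Smythe", "1948-03-09"),
        ("Smith RJ",       "1948-03-09"),
    ]
    query = ("Robert Smith", "1948-03-09")
    for cand_name, cand_dob in index:
        print(cand_name, round(match_score(*query, cand_name, cand_dob), 2))
```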

In conclusion: Healthcare IT promises, and often delivers, faster, better, and more comprehensive medical care. But underlying those promises is the assumption that patient data in the IT systems are secure, and that the software used to collect, analyze, present, and transfer that information is not easily compromised. As we have sought to explain, there are good reasons to doubt the data security of many medical data systems.

We are not suggesting that medical authorities or their IT staffs are cavalier about these dangers. They are aware, concerned, and actively seeking to protect that information. However, the vulnerabilities we’ve noted are profoundly complex and often shifting. The number of separate systems, the age of some of those systems, and the dangers of combining devices and data sources present challenges that are severe and constant. Worse, these challenges are always emergent, because software, hardware, consultants, patient populations, clinicians, and business relationships are seldom static. Equally disconcerting, the system’s vulnerability depends on workers at healthcare facilities and related organizations who are the targets of increasingly sophisticated hackers. Hackers send spear-phishing messages that incorporate friends’ and bosses’ names and topics of great urgency, including employees’ children’s names and their teachers’ names.

Data in mobile devices and in transit represent yet another set of vulnerabilities. We seek the convenience of constant cyber connectedness, but seldom consider how that connectedness provides the bad guys constant access to data and systems. Protecting our information may mean very different ways of keeping and sending data. One man’s emoticon for home is another’s entrance to his bank account.

Medical informatics has developed over the past five decades, building incrementally, expanding its purview, promises, and prestige. It will soon know more about us than we do, via its access to our genetics and precision medicine’s algorithms. The security of our data is therefore all the more essential, even though the protection of our information is too frequently a slapdash patchwork of good intentions, private interests, some caring security engineers, and often limited resources devoted to security. As we write this, Yahoo announced the hacking of half a billion accounts’ passwords and personal information. It’s probable that Yahoo’s security systems are more robust than most people’s cell phones or their medical providers’ databases.

Jesse Walker received his Ph.D. in mathematics from the University of Texas at Austin in 1980. He taught at Iowa State University from 1980 to 1982 and then moved to industry, where he focused first on networking and then on cryptography and computer security. After recently retiring from Intel, he joined the EECS research faculty at Oregon State University and is a visiting scholar in mathematics at Reed College.

Ross Koppel, PhD, FACMI has been at the University of Pennsylvania for 25 years, where he teaches sociology, is a Senior Fellow at LDI Wharton, and is PI on several projects involving healthcare IT and cybersecurity. He is also professor of biomedical informatics at the University at Buffalo (SUNY).


3 replies

  1. As cyberthieves become bolder, more creative, and more successful, the risks to our personal information increase, as the article says. Security Rule safeguards can help health care providers avoid some of the common security gaps that could lead to cyber-attack intrusions and data loss, but organizations need to take more preventive measures against cyberattacks. Educating employees on cybersecurity and on precautions against cyberattacks is one of the best ways to avoid a data breach. Cybersecurity-related online communities can be a good reference for employees seeking more information. I would like to suggest Opsfolio.com, an online community for those involved with healthcare cybersecurity, which has been a good guide for me to healthcare cybersecurity information.

  2. This catalog of vulnerabilities obscures the real problem by accepting the current EHR architecture as a given. Just because we can use the Yahoo model to deliver email, groups, and search to half a billion people doesn’t mean it’s wise to do so. We can have email without webmail, groups that don’t share servers with our email, and search engines that don’t save any personal data about us at all. Similarly, assembling 10 million health records into one Epic honeypot is Disaster by Design.

    If health records are worth $65 each, then one 10-million-patient Epic system is worth about $650 million.

    The relationships I have with my physician, the lab, and the pharmacy do not require my records to be co-located with 10 million others’. The institutional monoculture approach to health IT does not serve me as a patient. It’s an anachronism that will be replaced by patient-centered longitudinal health records. I call on leaders and experts like you to broaden your perspective to include health record architectures that can meet our 21st-century challenges, including privacy, public health, and cybersecurity.

  3. This present era is the last and only time we can study EHRs compared with hand-written charts, because both are still in use at the same time. A singular event.

    It may be true that hand-written charts are actually a better way to go, considering everything: all the good and the bad.

    Not that we will ever go back, but it would be nice to find the truth. Possibly a hybrid chart is the ultimate answer.

    The problem is that EHRs have brought in many groups of new users, observers, and stakeholders (e.g., insurers and government bodies). How do we count the utility of EHRs for these new stakeholders? Perhaps the EHR and hand-written methods should be compared only with respect to their effects on the patient: quality of care, accuracy of diagnosis, efficiency of care, security and privacy, length of stay, complications of care, misadventures in pharmacy and therapeutics, etc.

    We bring in this brand-new tool on faith. It would be nice if we could compare it with the old tool. Evidence-based care.