Scott Erven is head of information security for a healthcare provider called Essentia Health, and his Friday presentation at Chicago’s Thotcon, “Just What The Doctor Ordered?” is a terrifying tour through the disastrous state of medical device security.
Wired’s Kim Zetter summarizes Erven’s research, which ranges from the security of implanted insulin pumps and defibrillators to surgical robots and MRIs. Erven and his team discovered that hospitals are full of fundamentally insecure devices, and that these insecurities are not the result of obscure bugs buried deep in a codebase (as was the case with the disastrous Heartbleed vulnerability). Rather, they are incredibly stupid, incredibly easy-to-discover mistakes, such as hardcoded, easily guessed default passwords.
For example: Surgical robots have their own internal firewall. If you run a vulnerability scanner against that firewall, it simply crashes, leaving the robot wide open.
The backups for image repositories for X-rays and other scanning equipment have no passwords. Drug pumps can be reprogrammed over the Internet with ease. Defibrillators can be made to deliver shocks, or to withhold them when needed.
Doctors’ instructions to administer therapies can be intercepted and replayed, adding them to other patients’ records.
You can turn off the blood fridge, crash life-support equipment, and reset it to factory defaults. The devices themselves are all reachable across the hospital-wide network, so once you compromise an employee’s laptop with a trojan, you can roam free.
You can change CT scanner parameters and cause them to over-irradiate patients.
T was never a star service tech at the auto dealership where he worked for more than a decade. If you lined up all the techs, he wouldn’t stand out: medium height, late-middle age, pudgy, he was as middle-of-the-pack as a guy could get.
He was exactly the type of employee that his employer’s wellness vendor said was their ideal customer. They could fix him.
A genial sort, T thought nothing of sitting with a “health coach” to have his blood pressure taken and his blood drawn, get weighed, and then use the coach’s notebook computer to answer, for the first time in his life, a health risk appraisal.
He found many of the questions oddly personal: how much did he drink, how often did he have (unprotected) sex, did he use sleeping pills or pain relievers, was he depressed, did he have many friends, did he drive faster than the speed limit? But, not wanting to rock the boat, and anxious to get the $100/month bonus that came with being in the wellness program, he coughed up this personal information.
The feedback T got, in the form of a letter sent to both his home and his company mailbox, was that he should lose weight, lower his cholesterol and blood pressure, and keep an eye on his blood sugar. Then came the perfect storm that T never saw developing.
His dealership started cutting employees a month later. In the blink of an eye, a decade of service ended with a “thanks, it’s been nice to know you” letter and a few months of severance.
T found the timing of his dismissal strangely coincidental with the incentivized disclosure of his health information.
Secrecy breeds suspicion. Secrecy plays practically no role in health care, so when we do see examples of it, as in the operational details of the Federal Data Services Hub, the result is the recent outcry from a range of politicians and journalists waving privacy flags. For Patient Privacy Rights, this is a teachable moment for both advocates and detractors of the Affordable Care Act.
There’s a clear parallel between the recent concerns around NSA communications surveillance and health care surveillance under the ACA. Some surveillance is justified, to combat terrorism and fraud respectively, but unwarranted secrecy breeds suspicion and may not help our civil society.
“The Hub” is described by the government as:
“For all marketplaces, CMS [the Centers for Medicare and Medicaid Services] is also building a tool called the Data Services Hub to help with verifying applicant information used to determine eligibility for enrollment in qualified health plans and insurance affordability programs. The hub will provide one connection to the common federal data sources (including but not limited to SSA, IRS, DHS) needed to verify consumer application information for income, citizenship, immigration status, access to minimum essential coverage, etc.
CMS has completed the technical design and reference architecture for this work, is establishing a cross-agency security framework as well as the protocols for connectivity, and has begun testing the hub. The hub will not store consumer information, but will securely transmit data between state and federal systems to verify consumer application information. Protecting the privacy of individuals remains the highest priority of CMS.”
Here’s where the secrecy comes in: I tried to find out some specific information about the Hub. Technical or policy details that would enable one to apply Fair Information Practice Principles? Some open evidence of privacy by design? Some evidence of participation by privacy experts? I got nothing. Where’s Mr. Snowden when we need him?