
Your Personal Health Data is Not Safe

Electronic medical records are an incredible boon to healthcare. When necessary, doctors can obtain important information such as your allergies, medical history, and known conditions, which can make all the difference in an emergency. But letting that information fall into the wrong hands could be a serious problem.

Regulations such as HIPAA aim to promote a super-high standard of security for personal medical information, with massive fines for failure. But a fine for security failure doesn’t necessarily create security success. Doctors and medical organizations rely on software vendors for secure systems, and as we’ve seen, software can be buggy. Worse, many medical organizations don’t have the knowledge to use those secure systems correctly or to keep them isolated from insecure ones.

Seth Fogie, Information Security Director for Penn Medicine, performed what he called an on-screen biopsy of healthcare security in the US for Black Hat attendees. It wasn’t pretty.

As Fogie introduced himself, he noted that he had presented at Black Hat 16 years ago on the topic of Pocket PC security abuse. That topic seems dated today, but as he pointed out, Windows CE and other antiquated, insecure systems are still in use in the healthcare industry.

“Patient records are being exploited and sold,” explained Fogie. “There is monetary value.”

In the security business, you often hear about zero-day vulnerabilities: security holes so new that nobody has seen them before. Fogie characterized the health industry’s problems as one-day vulnerabilities. They’re known, but they’re not fixed.

“H-ISAC (Health Information Sharing and Analysis Center) is aware, the vendors are aware, but there’s no guarantee of remediation,” said Fogie. He noted that no vendor names would appear in his talk. “My aim is to bring awareness to the public, guidance to the vendors, and insight for security folks.”

Fogie cast his engaging presentation as a story about a visit by Alice and Bob to the Black Hat Clinic. Security wonks will remember Alice and Bob from the original cryptographic paper in which Rivest, Shamir, and Adleman laid the groundwork for public-key encryption. Now they’re much older, and Bob needs attention at the clinic.

Drawing on his actual experience testing security, Fogie examined seven distinct types of medical systems that could be compromised, some with disastrous results. The story begins with an unfamiliar face appearing on the TV in Bob’s room and making a vague threat. How could that happen? Turns out it’s not a TV; it’s a Patient Entertainment System. As such, it can handle meal orders, accept screencasts from doctors, and more. And it’s not secure.

Medical staff these days use clinical productivity software. Doctors’ notes go into it, as do insurance coding data, patient instructions, and more. Fogie found a backdoor that gave access to more than 100,000 patient records.

Drug dispensing and monitoring must surely be the most secure, right? Well, no. Fogie found an easy way in. “We could dump usernames and passwords,” he explained, “and gain access to the drug distribution system. We could add ourselves as a user at any level. What a headache! We could even steal some acetaminophen.”

Fogie noted that the vendor fixed this one right away, and that his team didn’t really steal any headache pills.

The litany went on. Fogie found flaws in the temperature-monitoring system that could let a malefactor take control, resulting in ineffective meds or even poisoning. That Nurse Call system? It’s not just a buzzer; it’s a full-scale app, and it has a hard-coded backdoor password. As for the imaging system, the team got in easily by tweaking the code to accept even a wrong password. Finally, Fogie and his team gained full access to the “Downtime Device” that provides local information to a clinic when its datacenter is unavailable.
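To see why a local password check like the imaging system’s falls so easily, here is a minimal hypothetical sketch of authentication performed entirely on the device. It is illustrative only, not code from the actual product, which Fogie did not name.

```python
# Hypothetical client-side login check, for illustration only.
# The decision to grant access is made entirely on the local device,
# so anyone who can view and modify the app can override it.
import hashlib

STORED_HASH = "5f4dcc3b5aa765d61d8327deb882cf99"  # md5("password"), a placeholder value

def login(entered_password: str) -> bool:
    digest = hashlib.md5(entered_password.encode()).hexdigest()
    # An attacker who can edit the app in place only has to flip this
    # comparison (or make the function return True) and a wrong password
    # gets accepted -- the kind of tweak described in the talk.
    return digest == STORED_HASH

if __name__ == "__main__":
    if login(input("Password: ")):
        print("Access granted")  # never re-verified by any server
```

Because nothing on the server side re-checks the result, patching one line of the local app is all it takes.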

“That’s 225,000 patient records compromised, with little effort,” concluded Fogie. “That could be worth $2,250,000 or even $225,000,000. If we extrapolate this out, well, I probably could have named this a trillion-dollar issue.”

You might think finding security holes in medical devices and apps would take months of painstaking work, but it isn’t so. Fogie and his team spend two to four hours looking for specific security red flags, and all too often they find them. Among the things they look for are hard-coded backdoor passwords, which often literally contain the word “backdoor.” Seriously! Authentication that takes place only on the local device is another problem, because it’s easily hacked. With simple tools, a testing team can view an app’s source code and even modify it in place.
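As a rough illustration of that kind of red-flag hunt, here is a minimal sketch, assuming you have already unpacked an application’s files into a local directory. It is not Penn Medicine’s actual tooling, and the directory name and patterns are made up for the example.

```python
"""Scan extracted application files for common red flags such as the
literal word "backdoor" or credentials hard-coded into code or config.
Illustrative sketch only; a real review also examines authentication flows."""
import re
from pathlib import Path

# Patterns suggested by the talk's findings.
RED_FLAGS = [
    re.compile(r"backdoor", re.IGNORECASE),
    re.compile(r"""password\s*=\s*['"][^'"]+['"]""", re.IGNORECASE),
]

def scan(root: str) -> None:
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for lineno, line in enumerate(text.splitlines(), start=1):
            if any(p.search(line) for p in RED_FLAGS):
                print(f"{path}:{lineno}: {line.strip()}")

if __name__ == "__main__":
    scan("extracted_app")  # hypothetical directory of unpacked app files
```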

Fogie encouraged healthcare security teams to use Penn Medicine’s red-flag techniques. “If you have an opportunity when you’re out there doing a pen-test, look at the applications,” he said. “You may find something interesting.” He concluded with a plea to healthcare application vendors. “We’re talking about patient care here,” he said, “so this is a patient data privacy and security issue. Don’t make our job harder!”
