Do no harm: an oath for health IT developers

01.08.2016
As health professionals, nurses, doctors, and even pharmacists are held to a high standard: everything they do must be above board, and they can lose their licenses for failing to comply with ethical guidelines. Yet even though software engineers in health IT can have a far wider-reaching impact on patients, no equivalent code of conduct exists for developers.

The National Institutes of Health (NIH) recently granted the Mayo Clinic $142 million to create a biobank as part of the Precision Medicine Initiative Cohort Program. The program, which aims to enroll at least a million volunteers willing to share their health data in order to advance precision medicine, serves as a reminder of the security risks in health IT, yet security in the health care sector continues to lag behind.

The collection of health data is moving fast, which raises the question: should health IT programmers working on similar projects be held to the same ethical standards as doctors and other medical professionals?

In order to prioritize security in health IT, programmers should be required to take the Hippocratic oath just as health professionals do, especially as more biobanks are created.

"Software engineers and physicians need to work together to ensure the health and safety of patients first and the ingenuity of efficient health technology second," said Dr. Andrew Boyd, assistant professor in the department of Biomedical and Health Information Sciences at the University of Illinois at Chicago. 

"Algorithms are literally impacting millions of lives, and there needs to be a better way to empower developers to say this might be legal but this isn't doing right by the patient," said Boyd. A strong advocate for developers being held to the same professional standards of ethics as health care providers, Boyd said that security in health IT is a huge concern.

The same conclusion was drawn from a study released by Independent Security Evaluators (ISE) earlier this year. Ted Harrington, executive partner at ISE, said, "When I think about what our research demonstrated, it is that the fundamental business function in health care isn't consistent with the Hippocratic oath."

In every hospital Harrington visited, it was clear that those who deliver care follow this ethical practice: in their interactions with patients, in their protocols, and in sanitization measures that ensure patients don't leave sicker than when they arrived.

"In a cyber context, there are so many ways in which a patient could suffer harm or fatality," Harrington said, which is why key parallels pertaining to threat modeling can be drawn between hospitals and biobanks. 

"The primary assets that I would envision are protected by biobanks--repositories for human samples for use in medical research--could be compromised," Harrington said.

Requiring software engineers to take the developers' equivalent of the Hippocratic oath, said Harrington, "would realign their priorities to patient health. On-time delivery, hitting go-to-market timelines, cost considerations: these are all business decisions related to the development of that solution."

Developers need to be cognizant of those things, but their development practices should also reflect an awareness that what they are building could impact patient health.

The risks to patient health, explained Michael Borohovski, CEO and co-founder of Tinfoil Security, extend beyond actively causing the patient harm or pain.

"Imagine for a moment that there was a test for pancreatic cancer, wasn’t well test and the false negative rate was pretty high. 50/50 right/wrong. If that were the case and patients rely on it, now they go for another year potentially living with cancer not knowing that they have it. Not actively harming a patient by being mistaken on diagnosis or testing.

Mistakes made in the rush to market, particularly in the study of human genomes, can cause serious damage to patients, but the business goal for developers is to make a profit in addition to helping people.

"The Hippocratic oath might be a bit of a stretch. . It’s a little different in that doctors are exclusively there to help patients. They don’t have a duty to share holders. Their duty is to shareholders not to the patients or to the people whose data they store. Implicit in that duty to shareholders there is the responsibility to find and patch vulnerabilities," Borohovski said.

What needs to change, then, is the culture around security. Given that no software can ever be 100% secure, "companies need to adopt a culture of responding to security vulnerabilities quickly and with a vengeance," he continued.

The current culture and restrictions on security researchers, Borohovski said, "don't incentivize researchers to be ethical. Reporting a vulnerability could get you thrown in jail."

It behooves developers who work with or store sensitive data to do everything they can to find vulnerabilities. "Redefining the culture to make it easier to report will allow researchers to make more concerted efforts to find vulnerabilities," Borohovski said.

Calling for a change in culture rather than holding developers to a higher ethical standard might be an easy out, though.

Grant Elliott, founder and CEO of Ostendio, said, "We would simply be happy for them to meet general industry standards. Healthcare as an industry is significantly behind. The imperative or incentive to try and meet these basic security requirements doesn't seem to be as urgent for many reasons."

In the health care industry, the correlation between security risks and their consequences is not as clear as it is in other sectors, like retail or banking. People know of the Target breach, so they can avoid shopping at Target; there is an obvious bottom-line impact, said Elliott.

"That association isn't as clear in health care," said Elliott. "There are a lot of things that are done in the name of good medicine and in the name of the patient. For doctors, nurses, physicians, their first priority is patient wellness, and they need ready access to data. Any security controls can possible get in the way and hinder their core purpose."

How do developers go about fixing the issue when there really is no incentive for them to do so? Elliott said, "What is the incentive to impact change? Who is enforcing them to do this well? There is obviously some regulatory component, but who is making sure that when they build a product they are building in security from day one?"

Unfortunately, many developers right now won't do anything unless they are forced to, said Elliott. "Many will do the minimum they have to do. Fundamentally, the smaller companies need to try to get larger organizations to have a much more aggressive process that will trickle down," he continued.

While vendors continue to profit from rushing products to market, patients--whether it is their data, health, or cells--will remain at risk. 

(www.csoonline.com)

Kacy Zurkus
