
Medical devices are only as secure as the software they run, and it is well documented that software security remains a problem in this critical domain. The U.S. FDA and the Department of Homeland Security are actively addressing vulnerabilities and raising awareness, but we remain in a reactive, “discover-patch-release” mode. In fact, the current booming market for cybersecurity professionals in general appears to be built on that reactive stance. There’s a better way: the time-tested ounce of prevention.

Many, if not most, of the medical device software vulnerabilities actually exploited in current attacks are what I would characterize as implementation errors. They are not design errors; they are, basically, mistakes in programming. Hackers exploit these mistakes to take over systems, including, unfortunately, medical devices. The good news is that, to a large degree, these programming errors are avoidable.
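To make that concrete, here is a minimal C sketch of the kind of mistake I mean. The serial-command scenario and the function names are hypothetical, invented purely for illustration; the unchecked strcpy() pattern itself, though, is one of the oldest and most commonly exploited implementation errors.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical handler for a command arriving over a device's serial port.
 * The fixed-size buffer plus an unchecked strcpy() is a classic
 * implementation error: any input longer than 15 characters overruns buf
 * and corrupts adjacent memory, which an attacker can often escalate
 * into full control of the device. */
void handle_command_unsafe(const char *input) {
    char buf[16];
    strcpy(buf, input);                      /* no length check: overflow */
    printf("command: %s\n", buf);
}

/* The fix is a matter of programming discipline, not redesign:
 * bound every copy to the destination's size. */
void handle_command_safe(const char *input) {
    char buf[16];
    snprintf(buf, sizeof buf, "%s", input);  /* truncates; never overflows */
    printf("command: %s\n", buf);
}
```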

To address the issue, my colleague Tom Haigh and I wrote “Building Code for Medical Device Software Security,” which the IEEE Cybersecurity Initiative has published. This modest paper grew out of a workshop on the topic that we held in New Orleans in November 2014, and to a large extent it presents a set of components that represent the consensus of the group that attended.

We convened 40 volunteers from fields including cybersecurity, programming languages, software engineering, medical device development, medical device standards, and medical device regulation. After weeks of online collaboration, the group met for two busy days under the sponsorship of the IEEE Cybersecurity Initiative and the National Science Foundation.

This publication is only a beginning. We hope to establish a process for revising and augmenting it via feedback from readers, and we would like to see industry players arrive at a consensus set of best practices based on its structure. I encourage readers to approach it with an eye to how they would improve it: what’s there that doesn’t need to be, and what’s missing that does?

The “Building Code” Analogy
Though every industry relies on software in one way or another, medical devices clearly have the potential to harm their users, or worse. That’s why we addressed this area first with our “building code” approach.

Why the “building code” analogy?

In the past, the government has tried to improve IT security through some mild forms of regulation. We think it will be more effective to establish best practices through a consensus of industry and professional-society participants, and building codes provide a model for this process: architects and builders, not government, largely bear the responsibility for developing and maintaining the code.

Historically, building codes have been created in response to disasters. The Great Fire of London in 1666, for instance, led to a code that required stone facing and minimum street widths to prevent widespread conflagrations. We think the analogy for software is apt: we need to build our software out of bricks that are sound and not so easy to set on fire, so to speak, and we need software practices whose use can be validated fairly easily. Accordingly, automated ways of checking software integrity are on the rise.
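As one illustration of what such automated checking can look like in practice: because the buffer sizes in code like the earlier sketch are visible at compile time, mainstream toolchains can flag or contain the error mechanically. The program below is deliberately broken; built with common hardening options (for example, GCC or Clang with -O2 -Wall and -D_FORTIFY_SOURCE=2 on a glibc system, one typical setup among several), the overflow is reported at build time or trapped at run time rather than silently corrupting memory.

```c
#include <stdio.h>
#include <string.h>

/* Deliberately broken: the long literal cannot fit in the 8-byte buffer.
 * Because both sizes are known to the compiler, this is exactly the kind
 * of defect automated checks catch. Built with, e.g.,
 *     gcc -O2 -Wall -D_FORTIFY_SOURCE=2 demo.c
 * the toolchain warns about the overflow, and glibc's fortified strcpy()
 * aborts the process at run time instead of corrupting memory. */
int main(void) {
    char buf[8];
    strcpy(buf, "this string is far too long for buf");  /* caught automatically */
    printf("%s\n", buf);
    return 0;
}
```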

Adopting Best Practices: A Challenge?
Getting industry to coalesce around one set of best practices will be challenging. OEMs should be concerned about their software supply chains. Customer demand for better security is crucial, and I’ve been surprised and pleased to learn that at least one well-known clinic is doing its own penetration testing on the devices it purchases. It has also developed contract language telling vendors that they must comply with certain procedures and that they are expected to assume certain liability. When a customer of that stature speaks up, OEMs listen. Likewise, at the individual level, best practices may well take hold if failing to follow them exposes the programmer, or others in the supply chain, to liability and penalties.

This should be a “no-brainer”: it is simply an effort to mature the software development process behind medical devices. Unfortunately, we as a society have grown accustomed to software products that are constantly patched and updated; we don’t put up with that in any other field. If you had to patch the software on your toaster on a daily basis, you’d be justifiably upset.

But industry habits die hard. There’s the perennial pressure of speed to market, which works against sound security practices, and some device makers use legacy software that is inexpensive but vulnerable. Yet companies that adopt best practices for security can use that as a differentiator to stand out among their competitors.

I’ve been asked whether publishing best practices for software development actually tips off hackers to how secure software gets written. Unfortunately, hackers already know this material. It’s time for software developers to adopt practices that can thwart attacks, particularly attacks on medical devices, which can do real harm to real people.
