Medical devices are increasingly dependent on software, having evolved from the simple two-transistor circuits of early artificial cardiac pacemakers to sophisticated modern systems supporting infusion pumps, electrocardiogram analysis, and image-guided surgery. Even relatively simple implantable medical devices rely more and more on software to perform life-sustaining functions like cardiac pacing, drug delivery and insulin administration. To put the importance of software to the medical device industry in perspective, consider that the amount of software in medical devices is doubling every two years.
Given the critical roles these devices perform, medical device designers need to ensure that they operate reliably. With software playing such a large and increasing role in their functionality, that reliability often hinges on the quality of the software, including software produced by and received from third-party suppliers.
There are many considerations when it comes to the quality of that software. It obviously needs to reliably perform the duties for which it was designed. It needs to be resilient in the face of often hostile operating environments, such as inside the body or in areas like hospitals with abundant radio frequency noise. It needs to tolerate external influences, accidental and malicious. Because these devices are increasingly deployed as part of larger systems like those for telemetry and patient monitoring, they also need to be secure from attackers wishing to gain unauthorized access to device control channels, protected patient data and other parts of the system.
If you think these concerns are the realm of science fiction, think again. At the 2011 Hacker Halted conference, the recently deceased security researcher Barnaby Jack demonstrated remote compromise of an insulin pump to release a dose that would have been fatal to a real patient. His approach could identify and compromise implanted insulin pumps within a 300-meter range simply by scanning radio frequencies with a special antenna. Jack later demonstrated that many pacemakers can be similarly controlled from a laptop and remotely commanded to deliver an 830-volt shock to the wearer. His analysis showed that both attacks were possible due to software flaws in the devices themselves.
Ultimately, responsibility for the quality of a device’s software rests with the device manufacturer. Some quality measures have obvious implementations, such as producing and running test cases that verify correct behavior. But there are many corner cases to consider, and the FDA requires documentation that appropriate care was taken during development. Fortunately, the FDA also provides guidance to help manufacturers.
One recommendation, which is increasingly viewed as a requirement, is to use static code analysis tools in the development and/or verification process. These tools detect problems like buffer overflows, memory leaks, race conditions and null pointer dereferences that can lead to incorrect behavior and device failures.
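To make the idea concrete, here is a toy sketch (not any vendor's tool) that uses Python's `ast` module to flag two defect patterns, a resource leak and a dangerous `eval()` call, without ever executing the code under analysis. The sample source and the rules are invented for illustration; commercial analyzers apply far deeper interprocedural and data-flow reasoning:

```python
import ast

def find_issues(source):
    """Toy static checker: flags eval() calls and open() calls
    used outside a 'with' statement (a potential resource leak)."""
    tree = ast.parse(source)
    managed = set()  # open() calls that appear as 'with' context managers
    for node in ast.walk(tree):
        if isinstance(node, ast.With):
            for item in node.items:
                managed.add(id(item.context_expr))
    issues = []
    for node in ast.walk(tree):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id == "eval":
                issues.append((node.lineno, "call to eval()"))
            elif node.func.id == "open" and id(node) not in managed:
                issues.append((node.lineno, "open() without 'with'"))
    return sorted(issues)

sample = """
def read_config(path):
    data = open(path).read()   # file handle is never closed
    return eval(data)          # executes arbitrary text as code
"""
for line, msg in find_issues(sample):
    print(f"line {line}: {msg}")
# line 3: open() without 'with'
# line 4: call to eval()
```

The key property, shared with real static analysis tools, is that the defects are found by inspecting the program's structure, so they surface even on code paths that testing never exercises.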
With devices increasingly communicating over wireless protocols, the FDA also recommends that designers thoroughly assess the risk of building wireless technology into devices before it is implemented. Provided that use of the technology provides a compelling benefit, the FDA suggests that medical device manufacturers implement the following security control methods for their devices:
- Limit access to trusted users via authentication mechanisms such as ID, password, smartcard, or biometrics.
- Use encryption to ensure secure data transfer to and from the device.
- Implement fail-safe, secure device features that protect critical functionality and deploy features that allow organizations to recognize, log and act upon security compromises.
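As a hedged illustration of the first two controls, a device could require every command it receives to carry a keyed message authentication code, so that only a sender holding the shared secret can issue commands. This is a minimal sketch using Python's standard `hmac` module; the key, the command format, and names like `verify_command` are invented here, and a real device would also need secure key storage, key rotation, an encrypted transport, and replay protection (e.g., a message counter):

```python
import hmac
import hashlib

# Hypothetical shared secret provisioned to the device at manufacture
# time; real devices need hardware-backed key storage.
DEVICE_KEY = b"example-shared-secret"

def sign_command(command: bytes, key: bytes = DEVICE_KEY) -> bytes:
    """Return an HMAC-SHA256 tag authenticating the command."""
    return hmac.new(key, command, hashlib.sha256).digest()

def verify_command(command: bytes, tag: bytes, key: bytes = DEVICE_KEY) -> bool:
    """Accept the command only if the tag matches; compare_digest
    performs a constant-time comparison to avoid timing side channels."""
    expected = hmac.new(key, command, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

cmd = b"SET_RATE:70"
tag = sign_command(cmd)
print(verify_command(cmd, tag))              # True: legitimate command
print(verify_command(b"SET_RATE:200", tag))  # False: tampered command rejected
```

Even this small sketch shows why the controls take real engineering effort: the constant-time comparison, for instance, closes a side channel that a naive byte-by-byte check would open.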
That list doesn’t even touch on application-level vulnerabilities that an attacker can exploit to gain control of the device or system. The bottom line is that security is difficult, and it often takes a significant amount of thought and code to get right. But you cannot compromise it in the name of space or other constraints. If these capabilities are worth adding, you have an obligation to implement them correctly.
In addition to delivering reliable, relevant devices, many medical device manufacturers are accountable to shareholders and subject to other business demands. The quality and security standards of their products cannot be compromised, of course, but there still exists a responsibility to develop those products efficiently and responsibly. Fortunately, the same tools that improve quality and security can also improve efficiency—if you use them at the right time or in the right place.
According to the National Institute of Standards and Technology (NIST), the annual cost due to poor software quality in the U.S. is $60 billion. As much as 80 percent of software development budgets goes to fixing software defects found late in the development cycle, typically during the Quality Assurance (QA) phase of the Software Development Lifecycle (SDLC). A defect found in QA costs 10 times as much to fix as one found earlier in the development cycle. What’s more, a defect found post-release costs 30 times as much as one found and fixed during the coding, or implementation, phase of the SDLC.
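A back-of-the-envelope calculation makes those multipliers concrete. Only the 10x and 30x factors come from the figures above; the base fix cost and defect count are invented for illustration:

```python
BASE = 400      # hypothetical cost to fix one defect found while coding
DEFECTS = 100   # hypothetical defect count for one release

while_coding = DEFECTS * BASE        # all defects caught during coding
in_qa        = DEFECTS * 10 * BASE   # all defects caught in QA (10x)
post_release = DEFECTS * 30 * BASE   # all defects caught post-release (30x)

print(f"while coding:  ${while_coding:,}")   # $40,000
print(f"in QA:         ${in_qa:,}")          # $400,000
print(f"post-release:  ${post_release:,}")   # $1,200,000
```

Whatever the exact base cost, shifting defect discovery from post-release back to the coding phase reduces the repair bill by a factor of thirty.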
Static code analysis verifies the correctness of the software without executing the software. It can often be used before the software is complete, and enables developers to locate and fix defects early in the development process as the code is being written. Tools that understand the impact of code changes can streamline automated testing runs by skipping tests that are not affected by recent code changes. Pulling these verification steps into the inner development loop—what many term “development testing”—enables designers to find and fix software issues while coding, when they are least expensive and easiest to fix.
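The change-impact idea can be sketched in a few lines. The dependency map below is hypothetical and hand-written; real tools derive it automatically from coverage data or build-graph analysis, and the file and test names are invented for this sketch:

```python
# Hypothetical map from source files to the tests that exercise them.
TESTS_BY_FILE = {
    "pump_rate.c": {"test_rate_limits", "test_rate_alarm"},
    "telemetry.c": {"test_telemetry_auth"},
    "display.c":   {"test_display_render"},
}

def impacted_tests(changed_files):
    """Return the union of tests exercising any changed file."""
    selected = set()
    for path in changed_files:
        selected |= TESTS_BY_FILE.get(path, set())
    return sorted(selected)

print(impacted_tests(["pump_rate.c"]))
# ['test_rate_alarm', 'test_rate_limits'] -- telemetry and display
# tests are skipped because no code they depend on changed
```

Running only the impacted subset keeps the inner development loop fast enough that developers will actually run it after every change, which is what makes development testing practical.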
Software enables medical device manufacturers to deliver innovative devices that help patients lead better, healthier lives, but it also brings an inherent risk of failure that must be managed. As the amount of software in these devices grows, development testing offers a way to support the software validation process: testing early and often to identify and fix software problems, improving quality, security, and efficiency at once. Manufacturers that adopt these tools can save thousands of hours of development effort and bring better, more secure products to market faster.