
Beyond the Ilities: What Makes Medical Technology Unique

Tue, 03/17/2009 - 11:14am
Rick Schrenker

An Editorial

Rick Schrenker is the systems engineering manager in the Massachusetts General Hospital Department of Biomedical Engineering. He can be reached at 617-726-7534 or raschrenker@partners.org.

By illuminating the significance of quality attributes like high reliability, accuracy, and durability for the portable medical device market, Mark Downey's May 2008 ECN article "Technology Convergence Influences Portable Medical Devices" provides needed perspective on some of the constraints of the field. But by barely touching on usability and safety, it risks leaving designers with an incomplete mental model of user needs, particularly for devices being developed for the elderly and infirm.

The increasing survival, and therefore growing number, of the elderly does indeed pose a very difficult set of problems for an already stressed healthcare system, and technology will be a key component of addressing them. But I am always concerned when discussions about the application of technology in healthcare pay little or no attention to the real-world situational dynamics of the point of care. The statement "The portable medical device industry's requirements and expectations are similar to those of two other very different markets: consumer electronics and defense/military" can only be considered true for a narrow subset of use-case-agnostic requirements. If clinical engineering's decades of experience in the acute setting are at all indicative, extending medical technology successfully beyond institutional walls will require requirements and design engineers to take more into account than physiology, software, and hardware.

Please bear with me as I take you through the looking glass into my world before returning to the topic at hand. A good starting point is the focal point: the point of delivery of care. Home, health center, or hospital, everything my colleagues and I do is intended to support the interactions between providers and recipients of care. Marilyn Sue Bogner's model of concentric subsystems focusing on the point of care depicts not only the infrastructure within which we work but also the context within which errors and hazards evolve. Its applicability across the spectrum of delivery sites can inform the design of system components, including medical or even personal health devices.1

Any medical device designer who has never toured a hospital needs to, and I'm sure a technician or engineer at your nearest will be happy to show you around. Carrying Dr. Bogner's model with you, visit an ICU and look in on a room with a patient dependent on a ventilator to breathe, a patient monitor to acquire and display waveforms and vital signs, some number of infusion pumps delivering fluids and medications, and maybe even a dialysis machine or intra-aortic balloon pump. Notice how different the user interface of each is, even among the various makes and models of infusion pumps. Notice too that when alarms illuminate and sound, they can differ just as widely. For all intents and purposes, explicit systems integration doesn't exist. The same can be seen in the OR, the ER, and everywhere else. While this may be acceptable in consumer electronics, I seriously doubt that the military would accept the medical industry's norm. Ditto aviation. Stepping back, look at what else and who else is in the room, what they are doing, and the interactions among all the actors. If you can stay long enough, observe how the environment evolves with the patient's condition, what happens at change of shift, and what happens in an emergency.

In the regional healthcare system that my department supports, there are about 45,000 medical devices in use. Breaking these down, we manage over 3,800 models of equipment from over 800 manufacturers, representing almost 700 different types of devices. Every month, tens if not hundreds and sometimes thousands of new devices come in while others are retired. Managing the lifecycle support of our fleet has long been a challenge, and its nature is changing profoundly with the inclusion of software-based devices. With software upgrades happening more frequently, for instance, it is becoming difficult to define the lifetime of a device at all. Caregivers need to be trained more frequently, as do maintainers, and documentation needs to be updated. Very few of these changes occur in synchrony, which in turn means that the lifetime of a configured system of medical devices at any one bedside or in a care unit is shorter still. To me, this implies that new attention must be directed to risk management, human factors engineering, and systems engineering. For instance, it may very well be time (past time, in my opinion) to consider standardizing medical device user interfaces.
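To make concrete why asynchronous change shortens the effective lifetime of a configured system, consider a minimal sketch in Python. The device list and update intervals below are invented for illustration, not drawn from our fleet; the point is simply that a bedside configuration changes whenever any one device in it changes.

    import random

    # Hypothetical mean intervals between software updates, in days.
    # These numbers are invented for illustration only.
    device_update_intervals = {
        "ventilator": 540,
        "patient_monitor": 365,
        "infusion_pump_a": 270,
        "infusion_pump_b": 300,
        "dialysis_machine": 450,
    }

    def days_until_config_change(intervals, trials=100_000):
        """Estimate the mean days until the bedside configuration changes,
        modeling each device's next update as an exponential arrival.
        The configuration changes at the EARLIEST of the devices' updates."""
        total = 0.0
        for _ in range(trials):
            total += min(random.expovariate(1.0 / t) for t in intervals.values())
        return total / trials

    if __name__ == "__main__":
        print(f"bedside configuration changes every "
              f"~{days_until_config_change(device_update_intervals):.0f} days")

With the intervals assumed above, no single device changes more than once or twice a year, yet the configured system as a whole changes roughly every ten weeks. That compression is what makes risk management, training, and documentation so hard to keep current.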

Before leaving my world behind, I have to mention the newest wild card with which we have to deal: the networking of medical devices to achieve communications interoperability. The stories describing the technical issues and human intrigues over the two decades it has taken to get to this point are told elsewhere, but the day is drawing close when many of the devices described earlier will no longer be integrated primarily in the mind of the clinical user but instead by a background decision support system upon which the user will become increasingly reliant.

It would be reasonable at this point to ask whether the lack of systems integration presents a problem, given that hospitals seem to work. Setting aside the debate this issue always ignites, there is an answer to the less volatile question, "Why do they work as well as they do?" Simply put, the physicians, nurses, and technologists who use these devices to care for patients traversing an often non-deterministic state space have a great deal of experience doing so. They have cared for hundreds if not thousands of patients. They have learned the nuances and quirks not just of the devices but of the artifacts of all the systems that focus in on the point of care, thereby gaining a sixth sense of what to trust and what to question. And yet, as every clinical engineer and biomedical equipment technician knows, nurses and physicians make mistakes in the use of medical devices. Those mistakes rarely carry serious consequences, but each event reminds us that they could.

Returning to the realm of portable medical devices used outside the hospital, one has to ask: if medical experts who rely on devices throughout their careers make mistakes, why would we expect anything different of laypeople, particularly ones who are sick, and therefore possibly anxious, if not depressed? Even the term "worried well" carries with it the implication of anxiety. Cognitive issues associated with elderly self-care are well known.2 After you have taken the time to visit a hospital, visit a nursing home and an assisted living facility. Various degrees of memory impairment are common among these patients and residents. And the idea that technology can enable more of these people to stay home raises the questions, "How will they do?" and "How safe will they be with less direct human contact and observation?" How will the cognitively impaired recognize and react to anomalous behavior on the part of one or more devices? How will they recognize and react to a failure mode that would cause a professional to question a reading or interpretation?

Consider that not all of the devices being developed for the home are passive monitors; among the devices taking a more active role are medication monitors that include reminding capabilities. Passive versions of these have been around for years; active versions, both local and networked, are emerging. Networked systems provide the capability for oversight by a more cognitively healthy caregiver and bring other benefits as well. But networks add new risks as well: the potential for what Nancy Leveson calls "common-cause failures."3 A failure of the remote system affects all clients. If a client detects that failure, it can transition to a safe, network-independent state. But if the failure is so soft that it goes undetected locally, the consequences could include the emergence of unanticipated hazards affecting a number of cognitively impaired users at once, setting the stage for a new form of what Perrow termed a "system accident," in which many people are harmed.4
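To make the difference between detected and soft failures concrete, here is a minimal sketch of the fallback logic such a networked reminder client might use. The design, function names, and schedule are hypothetical, not any vendor's implementation.

    # Hypothetical networked medication-reminder client. On a DETECTED
    # server failure it transitions to a safe, network-independent state:
    # the last known-good schedule stored locally.

    LOCAL_SCHEDULE = ["08:00", "13:00", "20:00"]  # last known-good doses

    def fetch_remote_schedule():
        """Placeholder for the network call to the oversight server.
        Raises on a detected failure (timeout, refused connection)."""
        raise ConnectionError("server unreachable")  # simulate a hard failure

    def todays_schedule():
        try:
            schedule = fetch_remote_schedule()
            # Sanity check: an empty reply from a half-working server is one
            # kind of "soft" failure we can catch. A plausible but WRONG
            # schedule would pass this check, and every client that trusts
            # the same server inherits the same error: a common-cause failure.
            if not schedule:
                raise ValueError("empty schedule")
            return schedule
        except (ConnectionError, ValueError):
            # Detected failure: fall back to the network-independent state.
            return LOCAL_SCHEDULE

    if __name__ == "__main__":
        print("reminders for today:", todays_schedule())

The guard fires only on failures the client can see. A schedule that is well-formed but wrong sails through the check, and every client trusting that server inherits the same error at the same time.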

None of this is intended to deter you from working in my domain but rather to increase your appreciation of its breadth. Know that the devices going into the home and the fitness center are starting to communicate with the same hospital information systems as the devices at our hospital bedsides. Know that the places where our patients receive care define the boundaries of our system, not walls.

Please keep in mind Henry Petroski's admonition:5

"Any design change... can introduce new failure modes or bring into play latent failure modes. Thus it follows that any design change, no matter how seemingly benign or beneficial, must be analyzed with the objectives of the original design in mind."

I will leave you with pointers to books that speak to the challenges currently rocking my world, and suggest that if you want to make products to help my colleagues and me, make the trip to a medical center and consider reading one or more of these (in addition to the ones cited previously):

• Bogner MS, ed., Human Error in Medicine, Lawrence Erlbaum Associates, 1994.

• Gosbee J and Gosbee L, eds., Using Human Factors Engineering to Improve Patient Safety, JCAHO, 2005.

• Vicente K, The Human Factor, Routledge, 2006.

Informative websites include:

• Association for the Advancement of Medical Instrumentation: www.aami.org

• Continua Alliance: www.continuaalliance.org

• American College of Clinical Engineering: www.accenet.org

• CE IT Community: www.ceitcollaboration.org

• MD PnP: www.mdpnp.org

References

1. Bogner MS, ed., Misadventures in Health Care: Inside Stories, Lawrence Erlbaum Associates, 2004.

2. Park D and Skurnik I, "Aging, Cognition, and Patient Errors in Following Medical Instructions," in Bogner MS, ed., Misadventures in Health Care: Inside Stories, Lawrence Erlbaum Associates, 2004.

3. Leveson N, Safeware, Addison-Wesley, 1995, p. 57.

4. Perrow C, Normal Accidents, Princeton University Press, 1999, pp. 62-100.

5. Petroski H, Design Paradigms: Case Histories of Error and Judgment in Engineering, Cambridge University Press, 1994, p. 57.
