Proper design of the physical medical device itself is obviously important to its success. The design of the user interface for that device, however, can be equally critical. This article goes through the steps and considerations developers need to keep in mind when creating the visual interface for a medical device.
Jahnavi Lokre is the director of marketing for Aubrey Group Inc. She can be reached at 949-581-0188, x258.
Sophisticated graphical user interfaces have become ubiquitous in most consumer devices today. The growth of rich graphical user interfaces (GUIs) has been fuelled by powerful embedded microcontrollers with abundant memory, inexpensive and high-quality LCDs and touchscreens, and small-footprint operating systems.

Medical devices are expected to have the same rich user interfaces that users have become accustomed to with consumer devices; manufacturers are increasingly incorporating color LCDs and touchscreen interfaces into their medical devices.

Most designers have had their share of GUI projects whose scope and complexity grew unexpectedly, leading to cost and schedule overruns. The focus here is on a process that helps ensure a cost-effective and user-friendly interface.

Human Factors Engineering (HFE)

Human factors focuses on the role of humans in man-machine systems and how systems can be designed to work well with people–particularly with regard to safety and efficiency. Regulatory agencies, including the FDA, require medical device manufacturers (Figure 1) to demonstrate how human factors considerations were met during the product's development; a well-documented record can speed up the approval process.

HFE focuses on hazard scenarios related to device failures as well as use failures. It takes into account the environment in which the device will be used (light, noise, workload, other distractions), the abilities and limitations of the user, and device characteristics that play a part in the user interaction. Considering what the user sees, touches, or hears plays a significant role in creating a robust user interface experience and mitigating any hazards arising from user-device interaction.
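One common interface-level mitigation for a use-related hazard is to validate a critical input and require explicit confirmation before acting on it. The following sketch illustrates the pattern for a hypothetical infusion-pump rate change; the names and the safety limit are invented for illustration, not taken from any specific device or standard.

```python
# Hypothetical sketch: validating and confirming a high-risk user action.
# MAX_SAFE_RATE_ML_H is an assumed limit from a fictional risk analysis.

MAX_SAFE_RATE_ML_H = 500

def request_rate_change(new_rate, confirm):
    """Apply a new infusion rate only after validation and confirmation.

    `confirm` is a callable that presents the value back to the user
    (e.g., a modal dialog) and returns True only on explicit acceptance.
    """
    if not (0 < new_rate <= MAX_SAFE_RATE_ML_H):
        return ("rejected", "rate outside safe range")
    if not confirm(f"Set rate to {new_rate} mL/h?"):
        return ("cancelled", None)
    return ("accepted", new_rate)
```

Echoing the value back to the user in the confirmation prompt gives a second chance to catch slips such as a mistyped digit, which is exactly the class of use error HFE asks designers to anticipate.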

As user interfaces become more sophisticated, the focus on errors and human usability issues becomes more critical. At a recent AAMI conference, an FDA spokesman commented that more than one-third of medical device incident reports involved user error, and more than one-half of device recalls could be traced to user interface design.

Development Process

The GUI is only a part of the application software for the device and has to be integrated with the rest of the application, whether the device contains single or multiple processors (Figure 2). The GUI development process must include integration with the main application at the appropriate time.

The development of the interface typically follows an iterative software development process (Figure 3) within the HFE framework. It is important to consider risks and hazards throughout the development process.

Get Requirements From Key Stakeholders/Users

It cannot be emphasized enough how important it is to get input from the key stakeholders and users early. The key stakeholders are typically internal users–marketing, service, and manufacturing. The external users include the patient, nurse, and clinician. Focus groups, point-of-use interviews, and observations are additional sources of input at this stage. A review of similar systems also provides guidance on what does and doesn't work. It is important to refer to any standards that regulate the user interface, such as IEC 60601-1-8 for alarm signals. Above all, the team involved in determining the requirements must have a thorough understanding of the device functionality and intended use. Requirements related to color depth and schemes, 2D or 3D graphics, and language localization need to be addressed early because they affect decisions made in the following stages.
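Language localization is one of those requirements that is cheap to plan for up front and expensive to retrofit. A minimal sketch of the usual approach is a keyed string table with a fallback language; the keys and languages below are assumptions for the example, not from the article.

```python
# Illustrative sketch: a localized UI string table with language fallback.
# Keys and translations are invented for the example.

STRINGS = {
    "en": {"start": "Start", "stop": "Stop", "alarm_occluded": "Line occluded"},
    "de": {"start": "Start", "stop": "Stopp"},  # deliberately partial
}

def tr(key, lang="en"):
    """Look up a UI string, falling back to English if untranslated."""
    return STRINGS.get(lang, {}).get(key) or STRINGS["en"][key]
```

Keeping every user-visible string behind a lookup like this also forces the screen layouts to tolerate strings of different lengths, which is the localization problem that most often surfaces late.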

Design a Flexible User Interface

Designing the user interface with flexibility in mind will reap benefits as the product evolves and goes through the design iterations. The design has two components–conceptual and physical.

The conceptual design identifies the objects in the system, their properties and behaviors, and the relationships among these objects with respect to a user's interaction with the system. An object-oriented design approach works well here. This is also a good time to involve the industrial design team. A storyboard showing the entire system operation is a typical output of this phase.
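The object-oriented approach described above can be sketched in a few lines: screens own widgets (a relationship), widgets carry properties such as visibility, and behaviors are attached explicitly. The class names are illustrative only.

```python
# Minimal sketch of a conceptual object model for a GUI:
# objects, properties, behaviors, and relationships.

class Widget:
    def __init__(self, name):
        self.name = name
        self.visible = True  # a property

class Button(Widget):
    def __init__(self, name, on_press):
        super().__init__(name)
        self.on_press = on_press  # a behavior

    def press(self):
        # Hidden widgets ignore input.
        if self.visible:
            return self.on_press()

class Screen:
    def __init__(self, title):
        self.title = title
        self.widgets = []  # relationship: a screen contains widgets

    def add(self, widget):
        self.widgets.append(widget)
        return widget
```

Even this skeletal model makes the storyboard concrete: each storyboard frame maps to a Screen, and each user action maps to a behavior on one of its widgets.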

The physical design includes specifics about how an application will support the conceptual design; the type, size, and resolution of the display to be used; use of input devices, such as touchscreen, keypad, or mouse; the choice of the hardware and software platform; etc.

These design components are key to the device operation, so the earlier these decisions are made, the smoother the rest of the process.

Build a Prototype

A quick functional prototype starts the process rolling. The early prototype versions need not include the final graphics and color schemes; these can be added as the prototype evolves. Special attention needs to be given to screen space allocation, size of objects, and the use of multiple languages. The PC is a quick and efficient prototyping platform, and several design tools allow developers to take a GUI design from the PC to the embedded platform seamlessly. However, if developers are working with an embedded platform they have not used before, or are unsure of the performance it will achieve, they should consider moving to the embedded platform sooner.
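A common way to make such a prototype functional without any final graphics is to model screen navigation as a simple state machine: the table of transitions captures the storyboard, and the screens themselves can be placeholders. The screen names and events below are invented for illustration.

```python
# PC-side prototype sketch: screen navigation as a state machine.
# Screen names and events are illustrative assumptions.

TRANSITIONS = {
    ("home", "menu_pressed"): "settings",
    ("settings", "back_pressed"): "home",
    ("home", "alarm_raised"): "alarm",
    ("settings", "alarm_raised"): "alarm",  # alarms preempt other screens
}

class GuiPrototype:
    def __init__(self):
        self.screen = "home"

    def handle(self, event):
        """Move to the next screen; unknown events leave the screen unchanged."""
        self.screen = TRANSITIONS.get((self.screen, event), self.screen)
        return self.screen
```

Because the navigation logic lives in a data table rather than in the drawing code, stakeholders can rework the flow during early reviews without touching the rendering layer.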

Evaluate the Prototype and Iterate as Required

Getting a functional prototype that qualified users can evaluate is critical to the design process. Users and stakeholders will inevitably change their minds as to how things should work after they have had a chance to work with something tangible. The purpose of this step is to get early feedback and allow for fundamental changes to be handled sooner, reducing design cost and effort.

This phase, along with the Requirements, Design, and Prototype phases, should be repeated until the major design features and graphics have been solidified; plan for about two to three iterations.

Test Refined Prototype for Usability

Usability testing should be performed with a larger set of users and focus groups. The testing will result in further refinement and one to two iterations of the previous phases.

Move on to the Final Design phase when the results of the usability testing indicate that the major components of the design are acceptable.

Integrate GUI With Application

At this stage, the GUI is ready to be integrated with the application and the embedded environment, and the final design is polished, including the design elements (such as graphics) that must be implemented on the embedded platform. Special attention must be given to the characteristics of the embedded platform, such as user response times, speed of operation, brightness and visibility of the LCD, touchscreen sensitivity, etc.
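User response time on the target is easiest to manage if it is measured from the start of integration. One lightweight approach is to wrap event handlers so each one reports whether it met a response budget; the 100 ms figure below is an assumed budget for the example, not a cited requirement.

```python
# Hedged sketch: checking event handlers against a response-time budget.
# RESPONSE_BUDGET_S is an assumed requirement for illustration.

import time

RESPONSE_BUDGET_S = 0.100  # 100 ms, assumed

def timed_handler(handler):
    """Wrap an event handler; report whether it met the response budget."""
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = handler(*args, **kwargs)
        elapsed = time.perf_counter() - start
        return result, elapsed <= RESPONSE_BUDGET_S
    return wrapper
```

Running the instrumented handlers on the embedded target, rather than only on the PC, is what surfaces the platform characteristics (display refresh, touch sampling, CPU load) that this phase is meant to catch.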

Perform Acceptance Test

Final acceptance testing of the GUI should be performed, as part of the device, with all key stakeholders and users. Any anomalies observed during this phase should be corrected. Additionally, critical requirement and design changes should be incorporated. All other changes should be considered for future releases.

Track and Adapt to Changes in Environment

After the product has been deployed in the field, it is important to track technology changes and user workflow changes that affect the user interface. Responding to these changes can help extend the life of the user interface over the life of the product.


A structured process reduces product liability, improves customer value, shortens time-to-market, lowers development and training costs, and enhances the acceptance of the product.


For additional information on the technologies and products discussed in this article, see MDT online or Aubrey Group.