Observing the Human Factor

By John Walmsley

To truly understand a device’s use scenario, one must observe with intention.

The FDA has defined a use error as “a situation in which the outcome of device use was different than intended, but not due to malfunction of the device. The error may have been due to a poorly designed device, or it may have been used in a situation that promoted incorrect usage.” Notice that the definition leaves no room for the error being the responsibility of the user. Designers have a long history of making assumptions and over-relying on a user’s capabilities and capacities, and an inadequately designed device can hurt people.

Watching a scrub nurse gently prompt a world-renowned surgeon to put on his mask in the sterile field reminds the viewer that not everything happens the way in which a key opinion leader (KOL) describes it. The only way to really understand the actual use scenario for a medical device is to observe with intention. Understanding the scenarios in which a device will be used is critical to accounting for one of the trickier design challenges: The human factor.

A series of important steps can lay a sound foundation for subsequent design work. Important findings uncovered at the earliest stages are inexpensive to learn from and can influence thousands of design decisions made later in the design process.

To get a design right and minimize use error consequences, you need to understand who the user actually is—what their day is like and the other demands, conscious and unconscious, that they are under while operating the device. The actual user is unlikely to be the KOL. And don’t forget the other human factor: The patient. Whether the device is intended for an unconscious surgical patient or a vibrant young toddler, the patient needs to be considered as part of the system design. No matter how experienced the development team, how adamant the KOL or how inconvenient the timing, physical observation by an experienced observer is vital and will provide critical information for the design and development of the new device.

Information gathering by observation begins with the lean principle of “go and see”. Standing in with the surgical team, observing the community clinic, and shadowing the ophthalmologist may require some effort and connections to arrange, but the reward is a much truer understanding of the use environment—both intended and incidental. It’s much easier to remember that your future users are human when you’ve watched a nurse following instructions to assemble a surgical tool for the first time while the surgeon waits with the patient’s head open. It’s much easier to remember that equipment is not at the center of a surgical team’s attention when you’ve noticed that electrical cables are inevitably run over, damaged and taped up, and that equipment hits doorways hard far more often than a designer would expect or hope.

The behavior of a user who is focused on the operation of a new device is quite different from the behavior of a habituated user performing the same operation in the midst of a busy day. Noticing and recording what is actually happening, in addition to the intended workflow, goes a long way toward a design that accounts for the FDA’s “situation that promoted incorrect usage”.

The FDA’s example of a problem description gives further insight into their thinking: “Nurse was changing the concentration of a prescribed medication being infused through a pump. In programming a bolus before the concentration change, she misunderstood the default settings and accepted the bolus concentration as the final dose. As a result, the patient received a three-fold overdose.”

How the data is synthesized isn’t critical, but it takes time and needs to be done in a way that captures all of the findings, including the unexpected edge cases. Including the observed users in the review of the findings often uncovers further nuance or even unexpected situations. It can be challenging, however, for users to imagine situations beyond their routine. It is even harder for many new users to imagine a novel concept without seeing or holding it.

How does a potential user see and hold a device that has not yet been created? One approach many developers take is to finish the complete device as fast as possible to get it into users’ hands. There are good reasons for this, but it is entirely unnecessary to wait until completion to put something in a user’s hands.

While the ubiquitous nature of 3D printing has led many to leap to CAD to make the first formative models, there are many time-honored designer methods for placing a concept in a user’s hands. A mock-up can be created straightforwardly: imaginative assembly of found materials, quickly carved foam or even bent-and-glued cardboard all lend themselves to making and adjusting concepts. In the hands of an observant and creative human factors engineer or designer, the barely acknowledged needs and desires of a target user can be surfaced and examined, and first steps taken to define them.

During early analysis, it is important to try many options, including some that are not expected to be successful. Careful observation of user interaction with the suggested form concept can lead to prompt iteration. If the learning and changes happen live in an early formative study, progress can be better directed as the synergy between different options is realized. Asking a user to remember three iterations back is easier when it is still fresh in the brain and in muscle memory.

I can think of at least a couple of examples in which it didn’t take the designer’s full powers of observation in formative testing to notice that most users engaged with a mock-up either upside down or backwards. The cues and functionality that led to this were easily adjusted at that stage; the same changes would have been much harder if most design activities had been completed before testing.

A completed device that the user needs to engage with differently than they expect can lead to slower adoption (business risk) or error (patient risk). Even if it is clear that using the device upside down is wrong, the cognitive load the user must expend to overcome their expectation can result in other, unrelated errors. These unrelated errors can be difficult to discover until they surface much too late, when the device is already in the field.

A design based on a sound understanding of human factors will be much more robust in subsequent testing and at launch. Beyond avoiding use errors, your users may even love your device.

About The Author

John Walmsley, Starfish Medical