Today’s intuitive touch screens and icons are minimizing training expenses and improving procedural outcomes, while giving manufacturers effective ways to differentiate devices in the marketplace and deliver product upgrades. And tomorrow’s interfaces will likely evolve from flat, 2D experiences to 3D or “spatial” interfaces, where the user interacts within a virtual space.
Flexible screens and increasingly portable devices made possible by lower power requirements will further change the way doctors, nurses, and patients interact with medical devices. The industry may even see some technologies from the Wii playbook make their way into the medical setting, where intuitive gesturing could be used to manipulate software.
User-interface designers are the software counterparts to industrial designers. That is, while industrial designers focus on the physical interaction and form of products, interface designers make the critical decisions about how information is organized, presented, and prioritized. With the increasing complexity of medical devices and the expanding role of software in driving their function, interface designers have become invaluable to the development of successful medical devices.
Meanwhile, great advancements in technology have paved the way for the user interface to play a vital role in new medical devices. Touch screens, improvements in resolution, and decreasing costs for such capabilities have set the stage for manufacturers to leverage the user interface as a primary way to improve both the user experience and the competitive advantage offered by their products. User interaction has evolved far beyond molded buttons and dials.
Safety, for example, is always the most important consideration in medical product development. A robust user interface can be designed to be more intuitive than a series of buttons and levers could ever be, making many input errors caused by simple confusion a thing of the past. By integrating controls, feedback, and safety checks into the software and interface, human errors such as delivery of an incorrect medication dosage can be prevented. Similarly, highlighting important or time-sensitive data can help a nurse or technician efficiently track a patient’s real-time condition, or spot signs that might otherwise be overlooked and flag them for further review by a physician.
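The kind of software-side safety check described above can be sketched in just a few lines. This is a minimal illustration, not clinical logic: the drug names and dose limits below are entirely hypothetical.

```python
# Minimal sketch of a software-side dosage safety check.
# Drug names and dose limits are illustrative only, not clinical guidance.

SAFE_DOSE_RANGES_MG = {
    "drug_a": (0.5, 10.0),    # hypothetical per-dose limits in milligrams
    "drug_b": (25.0, 400.0),
}

def validate_dose(drug: str, dose_mg: float) -> tuple[bool, str]:
    """Return (ok, message) so the interface can block or confirm delivery."""
    if drug not in SAFE_DOSE_RANGES_MG:
        return False, f"Unknown drug '{drug}': manual review required"
    low, high = SAFE_DOSE_RANGES_MG[drug]
    if dose_mg < low:
        return False, f"Dose {dose_mg} mg is below the minimum {low} mg"
    if dose_mg > high:
        return False, f"Dose {dose_mg} mg exceeds the maximum {high} mg"
    return True, "Dose is within the configured range"
```

Because the check returns a message along with a pass/fail flag, the interface can surface a clear, human-readable reason for blocking an entry rather than a bare error code.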
Designing with an integrated, software-based user interface can also save redevelopment or refresh costs down the line. Rather than redesigning all of a device’s hardware, manufacturers now have the option of improving only the software and interface within a product’s existing form factor, simply swapping out old interface hardware for a new touch screen and associated software. This has to be handled carefully from a regulatory standpoint, however, as redesigning legacy products can be a balancing act between permitted advancement and pushing a device into the requirement for a new FDA approval. With new displays now being built with brains (motherboards, processors, etc.) and embedded software, rather than requiring boards, housings, and buttons to be built from the ground up, this is an avenue more manufacturers will consider in coming years.
Another advantage to software-based user interfaces is the training and instructional capability offered by a touch screen or graphics-oriented interface. It’s not uncommon to find Post-it notes and pages torn from user manuals taped to devices to remind or instruct the operator – hardly the optimal way to ensure proper and safe use. Confusing interfaces can also squash efficiency and drive up training and tech-support costs for both the manufacturer and the owner. If investments are made early to develop an interface that offers meaningful guidance to the user, big dividends can be reaped in purchaser satisfaction as well as in reduced front-end training and downstream support expenses for all parties.
User-interface design is not just about what information is presented, when it’s shown, and how it looks – it’s also about choosing the appropriate method of delivery for the information within the context of where a device will be used, who will be using it, and the way(s) they are expected to interact with it. For example, when presented with a challenge to design the interface for a device used in an operating room, an understanding of what information needs to be efficiently and effectively disseminated (and to whom), what data has to be entered, the size of the device, whether it will be pole-mounted, if gloved hands will be touching it, and whether it might come in contact with various fluids or chemicals, are all important considerations.
A device required to disseminate vital information to the surgeon and nurses, for example, must be easily readable from across the room, and an LCD might not be bright enough to read at the specified distance. Since a VFD (vacuum fluorescent display) can be difficult to read at an angle if the contrast is low, it’s essential to understand where those accessing the information will be standing throughout the procedure. And since the people in the room will be wearing gloves, button size, orientation, and spacing (and whether a touch screen is even appropriate for the tasks at hand) must all be considered.
Alternatively, a device to be used and carried all day by a child would have entirely different requirements, including durability, ergonomic, and aesthetic concerns that are not relevant to the design of a device that never leaves a doctor’s office. Since just about everybody interacts with technology (be it a smartphone, computer, or gaming system), some of the functionality, aesthetics, and methods of information delivery seen in those products can be successfully applied to such a device to make it more intuitive and familiar-feeling. A device designed for a child might, for example, feature a simple “thumbs up” or “thumbs down” graphic instead of potentially confusing (and intimidating) numeric results, effectively communicating information to a younger user.
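The presentation layer behind such a simplified display could look something like the sketch below. The metric and its acceptable band are invented purely for illustration; real thresholds would come from the device’s clinical requirements.

```python
# Illustrative sketch: translating a raw numeric reading into a simple,
# child-friendly indicator instead of displaying the number itself.
# The IN_RANGE band is a hypothetical value, not a clinical threshold.

IN_RANGE = (80, 140)  # hypothetical acceptable band for some metric

def child_friendly_feedback(reading: float) -> str:
    """Map a raw reading to a symbol a young user can understand."""
    low, high = IN_RANGE
    if low <= reading <= high:
        return "thumbs_up"    # rendered on screen as a friendly graphic
    return "thumbs_down"      # prompts the child to tell an adult
```

The raw reading can still be logged and transmitted for clinicians; only the on-screen presentation is simplified for the child.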
The importance of interface design will only continue to grow and become more central to device development as the emerging telehealth industry takes off; the benefits offered by wireless connectivity between and among medical devices will naturally drive increased functionality and interaction to the software side of product development. Designing to support electronic medical records and constant, real-time monitoring of myriad health and wellness metrics (to be shared among multiple people and institutions) will also present opportunities for device manufacturers.
With more tools at their fingertips, user-interface designers will have a unique opportunity to turn the world of medical devices on its head – and to make interaction with both technology and caretakers a more personal experience that can ultimately aid in healing.