Human machine interfaces enhance user experience through touchless technology

Discover the transformative shift in human-machine interaction as touchless technologies redefine how we interact with machines.


26 Jul, 2024

The way we interact with machines has undergone a remarkable transformation. Mechanical buttons and levers once defined our control over devices. Then, touchscreens revolutionised the user experience, providing a direct and intuitive way of issuing commands. Today, we stand on the cusp of another major shift—the rise of touchless human-machine interfaces (HMIs). A human-machine interface is the platform through which humans interact with machines, systems, or devices.[1] HMIs translate complex data into a user-friendly format.

Why embrace this change? Touchless HMIs offer compelling advantages. They empower individuals with disabilities to interact with technology independently, opening doors to communication and control that were once inaccessible. Furthermore, they promote hygiene in public environments like hospitals and kiosks, minimising the spread of germs through shared touch surfaces.[2] Perhaps most importantly, these interfaces often provide a more natural and intuitive user experience – we instinctively communicate through speech and gestures, and these interfaces reflect those inherent patterns.

In this article, we will examine how touchless HMIs enhance the user experience, explore their applications in various sectors, and discuss their potential for the future of communication between humans and machines.

Human-machine interfaces began with the simple yet enduring pushbutton and dial. For decades, these physical controls were the primary means of directing machines. While reliable, they were often cumbersome and inefficient, especially when dealing with complex systems.[1] A factory floor, for instance, might require a vast array of buttons and dials to manage its machinery, adding complexity to the operator's task. 

The advent of Programmable Logic Controllers (PLCs) brought about a significant shift. PLCs are essentially industrial computers that automate tasks based on programmed logic.[3] While revolutionising automation, direct interaction with PLCs typically involved specialised interfaces, often text-based or requiring specific code knowledge. This created a barrier between operators and the machinery they supervised, slowing down configuration and troubleshooting. Compared to the intuitive operation of pushbuttons and dials, PLCs demanded a steeper learning curve for operators.

Touch panels provided a significant leap forward in HMI design. These intuitive interfaces replaced many physical controls with digital ones, offering greater flexibility and a more streamlined user experience. Operators could easily switch between control screens, monitor multiple processes at once, and access detailed diagnostics with the touch of a finger. However, touchscreens still carry limitations. They rely on direct physical contact, a hygiene concern in sensitive environments.[4] They also demand a user's proximity to the device.

As touch panels matured, voice control emerged as the next frontier in HMI development. Voice-enabled interfaces allow users to issue commands and control devices hands-free. This touchless HMI technology is transformative across industries. A surgeon, for instance, could access patient records or adjust medical equipment during a procedure without breaking sterility. Factory workers might remotely control machinery using voice commands, enhancing productivity and safety. Moreover, voice control opens new avenues for remote monitoring and managing industrial processes. Engineers can troubleshoot issues, receive alerts, and make adjustments from anywhere, leading to faster response times and less reliance on on-site personnel.

Touchless HMIs seem like magic, but they rely on sophisticated technologies to bridge the gap between humans and machines. Voice control systems use natural language processing (NLP) to understand and respond to our spoken words.[5] Specialised devices track and interpret hand gestures, translating motions into commands. Meanwhile, eye-tracking systems monitor our gaze, allowing us to select items, navigate menus, or even scroll through documents simply by shifting our focus. These are just the beginnings of touchless control, and the potential for new and innovative interaction methods is as vast as our imaginations.
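To make the voice-control idea concrete, here is a minimal sketch of the last stage of such a pipeline: once a speech recogniser has produced text, the utterance must be mapped to a device command. Production systems use trained NLP models for this intent classification; the keyword table below is a deliberately simplified stand-in, and all names (`INTENTS`, `parse_command`) are illustrative, not a real API.

```python
# Illustrative sketch: map a recognised utterance to a device intent.
# A real voice HMI would use an NLP model here; simple keyword
# matching stands in for intent classification.

INTENTS = {
    ("turn on", "light"): "lights_on",
    ("turn off", "light"): "lights_off",
    ("set", "temperature"): "set_temperature",
}

def parse_command(utterance: str) -> str:
    """Return the intent whose keywords all appear in the utterance."""
    text = utterance.lower()
    for keywords, intent in INTENTS.items():
        if all(keyword in text for keyword in keywords):
            return intent
    return "unknown"
```

Gesture and gaze input feed the same kind of mapping: a tracked motion or fixation is classified, then dispatched to a command in exactly this way.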

Smart Homes: Control lighting, temperature, appliances with voice or gestures

These human-machine interfaces transform the way we interact with our homes. We speak a simple command to adjust the lights, change the temperature, or play our favourite music. We wave our hand to pause the smart TV or draw a gesture in the air to close our blinds. Wall-mounted interfaces, often found in high-end systems, provide comprehensive home control with visual feedback and easily complement voice and gesture commands.

Smart home technology goes beyond individual commands. Location tracking can trigger actions automatically, like turning lights on when you arrive home or locking doors when you leave. Users can also create personalised, pre-programmed sequences, like a "Goodnight" scene that automatically dims the lights, lowers the thermostat, and arms the security system, all activated by a simple voice request or a quick gesture. Such human-machine interfaces bring convenience, flexibility, and even a touch of magic to our living spaces.
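The scene idea above amounts to one trigger fanning out to several device commands. The sketch below shows that structure; the `Scene` class and the device names are illustrative assumptions, not any particular smart-home platform's API.

```python
# Hedged sketch of a smart-home "scene": a single trigger (voice or
# gesture) fans out to a list of per-device commands.

from dataclasses import dataclass, field

@dataclass
class Scene:
    name: str
    actions: list = field(default_factory=list)  # (device, command) pairs

    def run(self) -> list:
        # A real system would dispatch each command to a device driver;
        # here we simply return the command log.
        return [f"{device}: {command}" for device, command in self.actions]

goodnight = Scene("Goodnight", [
    ("lights", "dim to 10%"),
    ("thermostat", "set 18C"),
    ("security", "arm"),
])
```

Triggering the scene by voice then reduces to mapping the recognised phrase "goodnight" to `goodnight.run()`.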

Cars: HMIs that minimise distractions while driving (voice, gesture-based)

These interfaces promise to make our interactions with vehicles safer and more intuitive. Head-up displays (HUDs) exemplify this by directly projecting critical driving information (speed, navigation cues) onto the windshield.[6] This allows drivers to focus on the road ahead, minimising distractions inherent in glancing down at a traditional dashboard. Additionally, voice control and gesture recognition systems enable drivers to adjust climate settings and music or answer calls without needing to take their hands off the wheel.


A heads-up navigation display of a car showing directions, speed, and time. Image credit: Freepik / rawpixel.com

Augmented reality (AR) takes the concept of HUDs even further. In AR-powered HUDs, a transparent surface called a combiner acts as a display. Information is overlaid directly onto the driver's field of view, merging with the real world. Imagine navigational arrows visually guiding you through turns or hazard warnings highlighting obstacles on the road. AR systems have vast potential for improving drivers' situational awareness and decision-making. The field of view (FOV) of an AR display is crucial, as it defines the size of the area where information can be projected.
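The field of view mentioned above follows from simple geometry: a virtual image of a given width, appearing at a given distance from the driver's eyes, subtends an angle of 2·atan(width / (2·distance)). The sketch below computes this; the example figures (an image 0.8 m wide appearing 2.5 m away) are illustrative, not measurements from any specific HUD.

```python
import math

def horizontal_fov_deg(image_width_m: float, viewing_distance_m: float) -> float:
    """Horizontal field of view (degrees) subtended by a virtual image
    of the given width appearing at the given distance from the eyes."""
    return math.degrees(2 * math.atan(image_width_m / (2 * viewing_distance_m)))

# Illustrative example: a 0.8 m wide virtual image at 2.5 m gives a
# horizontal FOV of roughly 18 degrees.
fov = horizontal_fov_deg(0.8, 2.5)
```

A wider FOV lets the system annotate more of the road scene, which is why AR HUDs push for larger virtual images at longer apparent distances.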

Healthcare: Assistive HMIs for individuals with limited mobility

Touchless HMIs offer incredible potential for those with disabilities that affect their motor function. Voice and gesture-based interfaces can empower individuals with limited mobility to operate once inaccessible devices. For instance, a patient with paralysis might control an intelligent bed to adjust its position, call for assistance through a voice-powered home system, or even operate a computer using eye-tracking technology.

HMIs on some kidney dialysis machines display critical patient information, such as blood pressure and flow rates, while allowing technicians to adjust treatment parameters accurately. MRI scanners rely on sophisticated HMIs that guide the operator through the scanning process, enabling image selection and controlling imaging parameters.[2] Cardiac defibrillators often feature clear, intuitive HMIs that guide doctors through life-saving procedures, provide step-by-step instructions, and minimise the potential for errors in highly stressful emergencies.

Beyond day-to-day activities, touchless HMIs open doors to communication, entertainment, and self-expression. Individuals with speech impairments could use text-to-speech systems controlled by gestures or eye movements, allowing them to interact with others. In both therapeutic and home settings, interfaces that respond to even minimal gestures can facilitate access to music, art creation tools, and games, promoting independence and improving quality of life. 

Applications of HMIs in Industrial Automation

HMIs integrate with key control systems such as Supervisory Control and Data Acquisition (SCADA), Enterprise Resource Planning (ERP), and Manufacturing Execution Systems (MES). This integration provides a centralised view and control point for complex manufacturing processes. Operators enjoy a visual representation of machinery status, production data, and resource management, empowering them to make informed decisions in real-time. 

By making process visualisation and control easy, HMIs contribute significantly to enhanced efficiency on the factory floor. Operators can quickly identify bottlenecks, fine-tune production parameters, and reduce downtime through prompt troubleshooting. The clear presentation of key performance indicators (KPIs) provided by modern interfaces drives data-backed optimisation at every stage of the manufacturing process. 

HMIs also enable remote monitoring and management. Engineers and technicians can access real-time data on production lines from anywhere, receive alerts on critical events, and take preventive measures to mitigate potential issues. This remote capability reduces the need for on-site personnel, especially in hazardous or remote locations, boosting productivity and optimising resource allocation. With secure, cloud-based interfaces, manufacturers can monitor and control operations across multiple facilities, facilitating centralised decision-making and accelerating response times.
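The alerting described above is, at its core, threshold checking: each incoming reading is compared against configured limits, and out-of-range values raise an alert for the remote engineer. The sketch below shows that pattern; the sensor names and limits are assumptions for illustration, not values from any real plant.

```python
# Illustrative sketch of threshold-based alerting for remote monitoring:
# readings outside their configured limits produce alert messages.

LIMITS = {
    "temperature_c": (10.0, 80.0),  # assumed acceptable range
    "pressure_bar": (1.0, 6.0),
}

def check_readings(readings: dict) -> list:
    """Return alert messages for readings outside their configured limits."""
    alerts = []
    for sensor, value in readings.items():
        low, high = LIMITS[sensor]
        if not (low <= value <= high):
            alerts.append(f"ALERT {sensor}={value} outside [{low}, {high}]")
    return alerts
```

In a cloud-based HMI, each alert would be pushed to the engineer's dashboard or phone rather than returned as a list, but the limit check itself is the same.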

The Benefits and Challenges

Touchless HMIs offer significant advantages. They transcend limitations for individuals with disabilities. Voice-controlled interfaces bridge communication gaps, while gesture or eye-tracking systems can facilitate control over personal devices and augmentative technologies. Beyond accessibility, they promote improved hygiene — a vital consideration in high-traffic public areas. With such technology, self-service kiosks at airports or restaurants can be operated without direct touch, minimising the spread of germs from many users.

However, challenges exist. Accuracy remains a concern, especially for voice control in noisy environments or for users with non-standard speech patterns. These HMIs collect vast amounts of data — voices, gestures, and eye movements — raising serious privacy considerations. Ensuring robust data protection and transparency on how information is used will be crucial for widespread adoption. Furthermore, for these interfaces to be truly effective, they must be compatible across various devices and platforms, as a lack of standardisation can be a barrier to integration.

Designing Interfaces for the Future

The future of HMI design goes beyond purely functional interfaces. Medical devices, in particular, demand HMIs that are intuitive, reliable, visually appealing, and even calming. Smooth surfaces and clean lines can promote a sense of hygiene and sterility. At the same time, strategic use of colour and illumination in buttons and switches can quickly convey status or guide user action. Even in critical situations, well-designed HMIs should strive to reduce stress and facilitate quick, confident interactions.

While touchless technologies are exciting, tactile feedback remains indispensable in high-stakes environments like healthcare. The satisfying click of an illuminated pushbutton provides instant confirmation of an action. Emergency stop buttons must be instantly recognisable and easily actuated, providing fail-safe peace of mind during urgent situations.

Future HMIs must balance innovation and reliability, integrating cutting-edge technologies while preserving the essential elements that promote safety and an exceptional user experience. While still nascent, brain-computer interfaces (BCIs) are also on the horizon. Users could one day control devices with subtle thoughts, blurring the lines between humans and machines in unprecedented ways.[7]

References

  1. Papcun P, Kajáti E, Koziorek J. Human-machine interface in concept of industry 4.0. 2018 World Symposium on Digital Intelligence for Systems and Machines (DISA) 2018 Aug 23 (pp. 289-296). IEEE.

  2. Singh HP, Kumar P. Developments in the human-machine interface technologies and their applications: a review. Journal of medical engineering & technology. 2021 Oct 3;45(7):552-73.

  3. Alphonsus ER, Abdullah MO. A review of the applications of programmable logic controllers (PLCs). Renewable and Sustainable Energy Reviews. 2016 Jul 1;60:1185-205.

  4. Bhardwaj N, Khatri M, Bhardwaj SK, Sonne C, Deep A, Kim KH. A review on mobile phones as bacterial reservoirs in healthcare environments and potential device decontamination approaches. Environmental research. 2020 Jul 1;186:109569.

  5. Rani PJ, Bakthakumar J, Kumaar BP, Kumaar UP, Kumar S. Voice controlled home automation system using natural language processing (NLP) and internet of things (IoT). In 2017 Third International Conference on Science Technology Engineering & Management (ICONSTEM) 2017 Mar 23 (pp. 368-373). IEEE.

  6. Ward NJ, Parkes A. Head-up displays and their automotive application: An overview of human factors issues affecting safety. Accident Analysis & Prevention. 1994 Dec 1;26(6):703-17.

  7. Zander TO, Gaertner M, Kothe C, Vilimek R. Combining eye gaze input with a brain-computer interface for touchless human-computer interaction. Intl. Journal of Human-Computer Interaction. 2010 Dec 30;27(1):38-51.