Empowering Independence: Comparing Eye-Gaze Interfaces for Assistive Robots

Research Digest

June 25, 2025

A participant wears Pupil Labs Neon eye tracking glasses while piloting a robotic arm using their gaze. Source: ICRA 2025 - Comparison of Three Gaze Control for Assistive Robots

How can eye-gaze control enhance autonomy for individuals with tetraplegia?

Tetraplegia - partial or complete paralysis of all four limbs - presents major challenges in daily life. Assistive robotic arms can support autonomy in essential tasks, but only if they can be controlled intuitively and reliably. Many existing control methods are too complex, too invasive, or simply impractical for daily use.

Because eye movements are typically preserved in individuals with tetraplegia, gaze-based control has emerged as a promising solution. However, there's still limited evidence on how different gaze-based interfaces perform in practice, especially when it comes to balancing precision, effort, and ease of use.

Examining eye-gaze behavior for real-world robotic control

To address this crucial gap, Emanuel Nunez Sardinha and colleagues tested three gaze-based strategies for robotic arm control. Using Pupil Labs Neon for eye tracking, they studied 33 able-bodied participants whose head movement was restricted to simulate tetraplegia-like conditions.

The study used a Jaco assistive robotic arm with three interface types based on the researchers' Diegetic Graphical User Interface (D-GUI) method:

Diagram illustrating the three gaze-based control interfaces tested in the study: A) Graphical User Interface (GUI), B) Embedded Interface (EI), and C) Directional Gaze (DG).

  • Graphical User Interface (GUI): A traditional setup where users interact with a separate control panel to command the robot.

  • Embedded Interface (EI): A novel design where controls are placed directly on the robot’s end effector, keeping gaze and action closely aligned.

  • Directional Gaze (DG): A minimalist method using directional eye movements (e.g., “look up” to move the robot up), with no buttons or menus; a rough sketch of this mapping follows below.
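
To make the Directional Gaze idea concrete, here is a minimal sketch of one way such a mapping could be implemented on top of the Pupil Labs real-time API. The dead-zone radius, velocity gain, scene-camera resolution, and the gaze_to_velocity helper are illustrative assumptions rather than details from the paper, and the robot command itself is left as a stub.

```python
# A minimal sketch of a directional-gaze velocity mapping, assuming the
# Pupil Labs real-time API (package: pupil-labs-realtime-api). DEAD_ZONE,
# GAIN, and the frame size are illustrative values, not from the paper.
from pupil_labs.realtime_api.simple import discover_one_device

DEAD_ZONE = 0.15               # normalized radius where the arm holds still
GAIN = 0.05                    # m/s of end-effector velocity per unit of gaze offset
FRAME_W, FRAME_H = 1600, 1200  # assumed Neon scene-camera resolution

def gaze_to_velocity(gaze_x: float, gaze_y: float) -> tuple[float, float]:
    """Map a gaze point (scene-camera pixels) to a (vx, vy) velocity command."""
    # Normalize to [-1, 1] with (0, 0) at the center of the scene image.
    nx = (gaze_x / FRAME_W) * 2.0 - 1.0
    ny = 1.0 - (gaze_y / FRAME_H) * 2.0  # flip so "looking up" is positive
    if (nx**2 + ny**2) ** 0.5 < DEAD_ZONE:
        return 0.0, 0.0  # gaze near center: no motion
    return GAIN * nx, GAIN * ny

device = discover_one_device()  # find a Neon companion device on the network
try:
    while True:
        gaze = device.receive_gaze_datum()  # blocking; latest gaze sample
        vx, vy = gaze_to_velocity(gaze.x, gaze.y)
        # A call into the robot arm's SDK would go here.
        print(f"command: vx={vx:+.3f} m/s, vy={vy:+.3f} m/s")
finally:
    device.close()
```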

Performance was assessed using a modified Yale-CMU-Berkeley (YCB) Block Pick and Place Protocol, measuring scores and completion times. User experience was evaluated through subjective questionnaires, including the System Usability Scale (SUS) and the NASA Task Load Index (NASA-TLX), which measures workload. Objective gaze behavior metrics, such as saccades per minute and fixation duration, were also recorded.
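
For readers less familiar with these gaze metrics, the sketch below shows how saccades per minute and mean fixation duration can be computed from a list of detected gaze events. The GazeEvent structure is an illustrative assumption; in practice, equivalent values can be derived from fixation and saccade exports such as those provided by Pupil Cloud.

```python
# A back-of-the-envelope sketch of the two gaze-behavior metrics named above,
# computed from a list of fixation/saccade events. The event structure is an
# illustrative assumption, not a specific export format.
from dataclasses import dataclass

@dataclass
class GazeEvent:
    kind: str        # "fixation" or "saccade"
    start_s: float   # event start, in seconds
    end_s: float     # event end, in seconds

def saccades_per_minute(events: list[GazeEvent], duration_s: float) -> float:
    n = sum(1 for e in events if e.kind == "saccade")
    return n / (duration_s / 60.0)

def mean_fixation_duration_ms(events: list[GazeEvent]) -> float:
    durs = [(e.end_s - e.start_s) * 1000.0 for e in events if e.kind == "fixation"]
    return sum(durs) / len(durs) if durs else 0.0

# Toy usage on a 2-second snippet of events
events = [
    GazeEvent("fixation", 0.00, 0.35),
    GazeEvent("saccade", 0.35, 0.40),
    GazeEvent("fixation", 0.40, 0.90),
    GazeEvent("saccade", 0.90, 0.96),
    GazeEvent("fixation", 0.96, 2.00),
]
print(saccades_per_minute(events, duration_s=2.0))  # 60.0
print(mean_fixation_duration_ms(events))            # 630.0 ms
```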

Video demonstrating the experimental procedures adopted by Nunez Sardinha and colleagues.

Key Findings and Design Implications

This study offers encouraging insights into how different gaze-based interfaces can support intuitive, effective control of assistive robots, a critical step toward enhancing autonomy for individuals with limited mobility.

  • Visible interfaces improved precision. Interaction zones on a panel or on the robot itself led to more accurate task completion than directional gaze alone, showing that clear visual targets aid control.

  • Controls embedded on the robot reduced effort. Direct interaction reduced eye movements, suggesting a more relaxed and focused experience with better gaze-action alignment.

  • Users found all methods usable - with caveats. Usability ratings were similar overall, though users with vision impairments or glasses faced more challenges, underscoring the need for inclusive design.

  • Directional gaze was feasible but demanding. Simple eye pointing was intuitive, but the lack of visible controls made precise actions harder and visual scanning less comfortable.

These findings highlight important considerations for future development: interfaces that keep controls visible, minimize context switching, and reduce cognitive load can make assistive technology more accessible and intuitive. While the Graphical User Interface was familiar and effective, the Embedded Interface approach stood out for its potential to create seamless, gaze-aligned control experiences. As this research moves toward trials with individuals with tetraplegia, it lays a strong foundation for designing assistive technologies that are not only functional but empowering.

Further Resources
