Taking Flight: Decoding Pilot Cognition in the Real World

Research Digest

December 18, 2025

The Limits of the Simulator

The video above was recorded with a Pupil Labs eye tracker. It is not directly related to this research and is included for illustrative purposes only.

Aviation safety has seen dramatic improvements over the years, yet human factors, such as workload, stress, and loss of situational awareness, remain a primary contributing factor in an estimated 70–80% of accidents. Traditionally, researchers have relied on Flight Simulation Training Devices (FSTDs) to study these phenomena. While simulators provide a safe and controlled environment, they cannot fully reproduce the psychological realism or consequences of actual flight. This reality gap can alter pilot behavior, ultimately limiting our ability to understand the true cognitive demands of critical moments.

Bringing the Lab into the Cockpit

To address this gap, Rongbing Xu and colleagues at the University of Waterloo developed a standardized methodology for collecting high-quality multimodal data during real-world general aviation flights. By equipping a Cessna 172 with a suite of consumer-grade wearable sensors, the team successfully captured the physiological and cognitive states of pilots in operational conditions. This represents a significant leap forward, moving research out of the lab and into the skies.

A Multimodal Sensor Suite

The study utilized a comprehensive sensor ecosystem to generate a holistic view of pilot performance. The methodology synchronizes four distinct data streams:

  • Eye Tracking: Neon eye tracking glasses recorded gaze, pupil diameter, and blink rates at 200 Hz, revealing visual scan patterns and cognitive load.

  • Physiological Signals: A Muse S headband (EEG), Polar H10 chest strap (ECG), and EmbracePlus wristband (electrodermal activity and skin temperature) provided continuous measures of stress and workload (see the heart-rate-variability sketch after this list).

  • Flight Telemetry: Sentry Plus ADS-B receivers logged altitude, speed, pitch, and other aircraft parameters.

  • Expert Assessment & Self-Reports: A Certified Flight Instructor (CFI) rated maneuvers and ensured safety, while pilots provided brief post-maneuver self-reports of workload, stress, and situation awareness.
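The ECG stream in particular lends itself to simple workload proxies. As a minimal illustration, and not the authors' analysis pipeline, the Python sketch below computes RMSSD, a standard short-term heart-rate-variability measure, from a hypothetical series of Polar H10 R-R intervals:

```python
import numpy as np

# Hypothetical R-R intervals in milliseconds, as a chest strap such as
# the Polar H10 would report them; a real recording would be read from
# the device log rather than hard-coded.
rr_ms = np.array([812, 798, 805, 790, 776, 801, 820, 815, 799, 788])

# RMSSD: root mean square of successive R-R differences. Lower RMSSD
# is commonly interpreted as higher sympathetic arousal, making it a
# rough stress/workload proxy.
rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))

# Mean heart rate over the same window, for context.
mean_hr_bpm = 60_000 / rr_ms.mean()

print(f"RMSSD: {rmssd:.1f} ms, mean HR: {mean_hr_bpm:.0f} bpm")
```

In practice, measures like this would be computed over sliding windows and aligned to specific maneuvers rather than over a single short segment.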

Key Insights and Feasibility

The protocol was tested with 25 pilots, from student pilots to licensed aviators, in real flight scenarios including take-offs, steep turns, stalls, and landings. The study yielded 20 complete datasets, confirming that high-fidelity data collection is possible in a confined cockpit without compromising safety.

  • Operational Feasibility: Pilots adapted well to the sensors, reporting little discomfort or distraction, which indicates the setup is unobtrusive enough for real flight.

  • Ecological Validity: Collecting data in real flight captures genuine cognitive stressors and nuances often lost in simulation, and supports the development of machine learning models that detect overload or cognitive incapacitation.

  • Data Synchronization: Physiological, gaze, and flight telemetry data were successfully aligned using shared timestamps and the Neon scene video (a minimal alignment sketch follows this list).
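To give a concrete sense of what this alignment involves, here is a minimal sketch assuming each stream has already been converted to a shared UTC clock. The column names and synthetic values are invented for illustration and do not come from the study:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
t0 = pd.Timestamp("2025-06-01 14:00:00", tz="UTC")

# Gaze samples at 200 Hz and telemetry at 1 Hz -- two streams with
# very different native rates, each on the shared clock.
gaze = pd.DataFrame({
    "timestamp": t0 + pd.to_timedelta(np.arange(0, 10, 0.005), unit="s"),
    "pupil_diameter_mm": 3.0 + 0.2 * rng.standard_normal(2000),
})
telemetry = pd.DataFrame({
    "timestamp": t0 + pd.to_timedelta(np.arange(0, 10, 1.0), unit="s"),
    "altitude_ft": np.linspace(3000, 3200, 10),
})

# Pair each gaze sample with the most recent telemetry record,
# tolerating up to one second of residual clock skew.
merged = pd.merge_asof(
    gaze.sort_values("timestamp"),
    telemetry.sort_values("timestamp"),
    on="timestamp",
    direction="backward",
    tolerance=pd.Timedelta("1s"),
)
print(merged.head())
```

A backward merge with a tolerance keeps every high-rate gaze sample paired with the latest available telemetry record, while discarding pairings whose clocks have drifted beyond the allowed skew.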

Toward Smarter Training and Safer Cockpits

By measuring workload, stress, attention, and other cognitive states in real flight conditions, this methodology lays the foundation for a new era of aviation human factors research, offering unprecedented insight into how pilots respond to operational demands. These insights enable data-driven performance assessments, helping instructors identify areas where pilots may need additional support or targeted training.

The approach also allows for personalized pilot training programs tailored to individual cognitive and physiological responses, rather than relying solely on standardized procedures. Additionally, the rich multimodal dataset can inform the design of human-centered cockpit interfaces, such as adaptive displays or alert systems, that respond to a pilot’s current cognitive state.
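As a thought experiment, and not a system described in the study, a cognitive-state-aware alerting policy might look something like the sketch below; the workload index, threshold, and alert fields are all hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Alert:
    message: str
    critical: bool

def alerts_to_display(alerts: list[Alert], workload_index: float,
                      overload_threshold: float = 0.8) -> list[Alert]:
    """Hypothetical adaptive-alerting policy: when an estimated workload
    index (0..1, e.g. derived from pupil diameter and HRV) exceeds a
    threshold, defer non-critical alerts instead of adding to the load."""
    if workload_index > overload_threshold:
        return [a for a in alerts if a.critical]
    return alerts

# Under high estimated workload, only the critical alert remains.
pending = [Alert("Fuel imbalance", critical=False),
           Alert("Stall warning", critical=True)]
print([a.message for a in alerts_to_display(pending, workload_index=0.9)])
```

Any real implementation would of course need certification-grade validation; the point of the sketch is only that a continuous cognitive-state estimate gives the interface something principled to adapt to.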

Overall, this methodology supports the development of safer, more effective training and operational protocols, ultimately enhancing aviation safety.

Further Resources
