How Experts See the Game: Eye Tracking and AI in Live Basketball

Research Digest

August 13, 2025

Figure: A frame from the Neon scene camera during a live basketball game. Red boxes show player detections made by the YOLOv8N model, while the blue dot marks the referee’s gaze point as they focus on the game action. Source: Lozzi, D.; Di Pompeo, I.; Marcaccio, M.; Alemanno, M.; Krüger, M.; Curcio, G.; Migliore, S. AI-Powered Analysis of Eye Tracker Data in Basketball Game. Sensors 2025, 25, 3572. https://doi.org/10.3390/s25113572

Studying how experts visually engage with their environment is crucial for understanding high-level performance and decision-making, particularly in fast-paced, dynamic contexts like professional sports. In basketball, coaches and referees make rapid decisions based on complex visual stimuli, so understanding their visual attention is vital for optimizing performance and training.

The Challenge of Capturing Real-World Visual Data

Most eye tracking studies in sports have been limited to lab settings or controlled simulations. These can’t fully replicate the chaotic, high-pressure conditions of live games, like unpredictable ball movement, crowd noise, or strategic shifts. As a result, researchers have lacked the tools to measure how experts really see and think in the moment.

A New AI-Powered Approach for Live Basketball Analysis

To overcome these limitations, Daniele Lozzi and colleagues have developed a new system that uses wearable eye tracking and computer vision algorithms to analyze real-world visual attention in live basketball games. The system uses Pupil Labs Neon to collect gaze data during live games.

Two AI models process the scene in real time:

  • YOLOv8N – for detecting players and objects in the scene.

  • SegFormer – for identifying uniform colors to determine team affiliation.
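Once the segmentation step has isolated a player's uniform pixels, team affiliation can be decided by comparing the uniform's mean color to each team's reference color. The sketch below illustrates that idea in plain Python; the team names, colors, and nearest-color rule are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of color-based team assignment: given the uniform
# pixels a segmentation model isolated for one player, pick the team whose
# reference color is nearest in RGB space.

def mean_color(pixels):
    """Average an iterable of (r, g, b) tuples."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def assign_team(uniform_pixels, team_colors):
    """Return the team whose reference color has the smallest squared
    Euclidean distance to the mean uniform color."""
    avg = mean_color(uniform_pixels)
    return min(
        team_colors,
        key=lambda t: sum((avg[i] - team_colors[t][i]) ** 2 for i in range(3)),
    )

if __name__ == "__main__":
    teams = {"team_a": (200, 30, 30), "team_b": (30, 30, 200)}  # red vs. blue kits
    reddish = [(190, 40, 35), (210, 25, 20), (195, 35, 40)]
    print(assign_team(reddish, teams))  # a mostly-red uniform maps to team_a
```

In practice the pixel set would come from SegFormer's per-class mask rather than a hand-built list, but the decision rule stays the same.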

Using these tools, the system defines Areas of Interest (AOIs) and matches them to where the participant is looking. It then extracts features such as gaze count per AOI and pupil diameter, a well-established proxy for cognitive load.
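The AOI-matching step can be sketched in a few lines: each gaze sample is tested against the detected players' bounding boxes, and per-AOI gaze counts and mean pupil diameters are accumulated. This is a minimal illustration under assumed data shapes (gaze as scene-camera pixel coordinates, pupil diameter in millimeters), not the authors' code.

```python
# Minimal AOI-matching sketch: assign each gaze sample to the detected
# bounding box (AOI) it falls inside, then aggregate gaze count and mean
# pupil diameter per AOI.

def gaze_in_box(gaze, box):
    """gaze = (x, y); box = (x1, y1, x2, y2) in scene-camera pixels."""
    x, y = gaze
    x1, y1, x2, y2 = box
    return x1 <= x <= x2 and y1 <= y <= y2

def aoi_features(samples, aois):
    """samples: list of {'gaze': (x, y), 'pupil': diameter_mm}.
    aois: dict mapping AOI name -> bounding box.
    Returns per-AOI gaze count and mean pupil diameter."""
    acc = {name: {"gaze_count": 0, "pupil_sum": 0.0} for name in aois}
    for s in samples:
        for name, box in aois.items():
            if gaze_in_box(s["gaze"], box):
                acc[name]["gaze_count"] += 1
                acc[name]["pupil_sum"] += s["pupil"]
    return {
        name: {
            "gaze_count": v["gaze_count"],
            "mean_pupil": v["pupil_sum"] / v["gaze_count"] if v["gaze_count"] else None,
        }
        for name, v in acc.items()
    }

if __name__ == "__main__":
    samples = [
        {"gaze": (50, 50), "pupil": 3.0},
        {"gaze": (55, 60), "pupil": 3.4},
        {"gaze": (500, 50), "pupil": 2.8},
    ]
    aois = {"player_7": (40, 40, 80, 80), "player_12": (480, 30, 520, 90)}
    print(aoi_features(samples, aois))
```

In the real pipeline the boxes would be refreshed every frame from the detector's output, and the aggregation could additionally be split by game phase (offense vs. defense) to support comparisons like those reported below.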

Preliminary Findings: Coaches and Referees See the Game Differently

Initial tests with referees and coaches during live games revealed promising patterns:

  • Referees: Showed larger pupil diameters when observing the defending team, suggesting higher cognitive load when evaluating defensive actions.

  • Coach Team A (Offense): Focused more on their own players during offensive phases, showing heightened visual demand.

  • Coach Team B (Defense): Exhibited greater pupil dilation and gaze count during defensive plays, indicating more mental effort when under pressure.

  • Overall: Attention strategies varied significantly by role and game context, revealing clear contrasts in how each expert engaged with play.

Why It Matters

This research marks a significant step forward in sports psychology and performance analysis, showcasing how wearable eye tracking technology can offer valuable insights into the cognitive processes of coaches and referees during live games. By moving beyond the limitations of controlled lab settings, this approach allows researchers to capture authentic visual attention data in the heat of real-world action, bringing us closer to understanding how experts perceive, evaluate, and respond under pressure.

Understanding these visual strategies and patterns of cognitive load could inform better coaching practices, enhance referee training, and support the development of tools that aid fast, accurate decision-making on the court.

Looking Ahead

Building on these early results, the team plans to introduce automatic scene classification to streamline labeling of key game events. In the longer term, they envision expanding the system with additional sensor inputs to create a richer, more multidimensional picture of human behavior: a kind of “cognitive radar” for live sports that captures not just where experts look, but how they think on the fly.

Further Resources

Full article: https://pmc.ncbi.nlm.nih.gov/articles/PMC12158319/
GitHub repository: https://github.com/danielelozzi/speed
Research Center: Cognitive and Behavioral Science Lab (LabSCoC), University of L’Aquila, L’Aquila, Italy
