SocialEyes: Scaling Mobile Eye-tracking to Multi-person Social Settings

Research Digest

June 5, 2025

SocialEyes visualization showing eye tracking videos of audience participants with gaze overlay. Center image showing aggregate mapping of gaze for all 29 participants on a video of the performance. Source: "SocialEyes: Scaling Mobile Eye-tracking to Multi-person Social Settings" Saxena et al., CHI 2025. Used under BY-NC-SA 4.0 / Still frame taken from original video.

The Challenge

Understanding how people share attention in real-world settings—like concerts, classrooms, or collaborative work—has long been out of reach for eye-tracking research. Traditional tools are designed for one person at a time, in controlled labs. This leaves key questions about collective attention and shared gaze unanswered, because the technical hurdles of tracking many people at once “in the wild” have been too high.

A New Solution: SocialEyes

SocialEyes visualization showing eye tracking videos of audience participants with gaze overlay. Center image showing aggregate mapping of gaze for all 29 participants on a video of the performance. Source: "SocialEyes: Scaling Mobile Eye-tracking to Multi-person Social Settings" Saxena et al., CHI 2025. Used under BY-NC-SA 4.0

Researchers at McMaster University have created SocialEyes, a flexible software framework that, combined with Pupil Labs Neon eye-tracking glasses, makes large-scale, multi-person eye tracking possible in natural environments. Neon is lightweight, wearable, and minimally disruptive for participants. It works straight out of the box with no setup or calibration, accommodates diverse wearers (of all ages, ethnicities, and genders), and is robust enough for ‘in the wild’ research. Participants can adjust the glasses, scratch their heads, or even take the glasses off and put them back on, all without recalibration. This makes it practical, for the first time, to run large-scale eye-tracking studies in truly natural settings!

How SocialEyes Works — In Brief

  • Remote Control: All eye-tracking devices are managed and monitored from a central interface, so researchers don’t need to intervene during events.

  • Precise Synchronization: The system keeps all devices’ clocks tightly aligned, ensuring that every moment of gaze data lines up across participants.

  • Pupil Labs Realtime API: Remote control, data streaming, and synchronization are all handled via the Pupil Labs open-source Realtime API over a local network.

  • Shared Viewpoint: SocialEyes maps each person’s gaze from their own perspective onto a common “central view”—like the stage at a concert—so researchers can see where everyone is looking together.

  • Collective Analysis: The framework offers powerful tools to visualize and analyze group attention, including heatmaps, gaze similarity, and measures of how gaze patterns shift over time.
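The “shared viewpoint” step can be illustrated with a minimal sketch: each wearer’s gaze point, recorded in their own scene-camera pixel coordinates, is projected into the common central view with a planar homography. The 3×3 matrices below are invented placeholders for illustration; in practice SocialEyes estimates the homography from image correspondences between each scene camera and the central-view camera.

```python
import numpy as np

def to_central_view(gaze_xy, H):
    """Project a gaze point from egocentric scene-camera pixels into
    central-view coordinates using a 3x3 homography H."""
    x, y = gaze_xy
    v = H @ np.array([x, y, 1.0])  # lift to homogeneous coordinates
    return v[:2] / v[2]            # perspective divide back to 2D

# Placeholder homographies (identity and a 2x zoom) stand in for
# matrices estimated from real image features.
H_identity = np.eye(3)
H_zoom = np.diag([2.0, 2.0, 1.0])

print(to_central_view((320.0, 240.0), H_identity))  # unchanged point
print(to_central_view((320.0, 240.0), H_zoom))      # scaled point
```

Once every participant’s gaze lives in the same central-view coordinate frame, group-level measures like heatmaps and gaze similarity become straightforward aggregations.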

Putting SocialEyes to the Test

SocialEyes was put to the test at two large-scale events in McMaster’s LIVELab concert hall: a live concert and a film screening. Sixty participants in total wore Neon glasses, and the system remotely managed all devices while collecting synchronized gaze data from 30 people at once.

What Did We Learn?

  • It Scales: SocialEyes handled 30 Neon devices in parallel, with remote troubleshooting that didn’t interrupt the events.

  • Accurate, Synchronized Data: Gaze data from all participants could be precisely aligned in time and space, even in challenging conditions like motion or changing light.

  • Revealing Group Patterns:

    • During the concert, people’s gaze was more dispersed—likely tracking performers’ movements—while the film prompted more focused attention.

    • Where people were seated influenced where they looked.

    • Projecting individual gaze onto the shared view revealed much stronger patterns of collective attention than analyzing each person separately.

    • Gaze patterns changed over time, marking key moments like the start of the film or different sections of the performance.

    • The system could even track collective blinks, hinting at shared emotional or attentional states.
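The concert-vs-film contrast above comes down to how spread out the group’s gaze is at any moment. A toy sketch of one simple dispersion statistic over central-view gaze points (the data here are invented for illustration; SocialEyes uses richer measures such as heatmaps and pairwise gaze similarity):

```python
import numpy as np

def gaze_dispersion(points):
    """Mean Euclidean distance of each gaze point from the group centroid:
    a simple proxy for collective focus (lower = more focused attention)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    return float(np.linalg.norm(pts - centroid, axis=1).mean())

# Toy central-view gaze points for one instant:
# tightly clustered (film-like) vs spread out (concert-like).
focused   = [(400.0, 300.0), (402.0, 301.0), (398.0, 299.0), (401.0, 300.0)]
dispersed = [(100.0, 100.0), (700.0, 120.0), (400.0, 500.0), (650.0, 480.0)]

print(gaze_dispersion(focused))    # small value
print(gaze_dispersion(dispersed))  # much larger value
```

Tracking such a statistic over time is one way to surface the temporal shifts noted above, such as the moment a film starts or a performance changes section.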

Why It Matters

SocialEyes breaks down the barriers to studying real-world, group attention. It brings eye-tracking research out of the lab and into everyday life, making it possible to explore how we share focus in social, educational, and entertainment settings. This opens new doors for understanding human behavior and designing interactive technology that responds to collective attention.

Open and Accessible

Both SocialEyes and the Pupil Labs Realtime API are open source, and all code and analysis scripts are available in their respective GitHub repositories.

It’s great to see users leveraging our tools to build and extend custom applications. Looking forward to more live performances!
