Publications

Explore a collection of publications and projects from diverse fields that cite Pupil Labs and use Pupil Labs eye tracking hardware and software in their research.

Showing 10 of 247 publications
Eye movements in real-life search are guided by task-irrelevant working-memory content
2020
Cognitive Science
Cherie Zhou, Monicque M. Lorist, Sebastiaan Mathot
bioRxiv preprint
Attention is automatically guided towards stimuli that match the contents of working memory. This has been studied extensively using simplified computer tasks, but it has never been investigated (though often assumed) whether memory-driven guidance also affects real-life search. Here we tested this open question in a naturalistic environment that closely resembles real life. In two experiments, participants wore a mobile eye tracker and memorized a color prior to a search task in which they looked for a target word among book covers on a bookshelf. The memory color was irrelevant to the search task. Nevertheless, we found that participants' gaze was strongly guided towards book covers that matched the memory color. Crucially, this memory-driven guidance was evident from the very start of the search period. These findings support the idea that attention is guided towards working-memory content in real-world search, and that this guidance is fast and therefore likely reflects an automatic process.
Use of eye tracking device to evaluate the driver’s behaviour and the infrastructures quality in relation to road safety
2020
Transportation Safety, Automotive
David Vetturi, Michela Tiboni, Giulio Maternini, Michela Bonera
Transportation Research Procedia Volume 45, 2020, Pages 587-595
Eye tracking provides important insights into drivers' behaviour during driving by employing a device that monitors eye movements and therefore the user's point of observation. This paper explains how analysing driver behaviour through eye movements makes it possible to evaluate infrastructure quality in terms of road safety. Driver behaviour analyses were conducted in urban areas, examining the observation targets (cars, pedestrians, road signs, distraction elements) in quantitative terms (time spent fixating each target). In particular, roundabout intersections and straight segments of urban arterials were examined, and records of seven drivers' behaviour were collected in order to obtain meaningful statistical variability. Only young drivers were considered in this study. The analyses made it possible to assess how different types of infrastructure influence road users' behaviour, in terms of the safety performance afforded by their design. In particular, quantitative analyses were carried out on the driving time dedicated to observing attention targets rather than distraction targets. Statistically, the relationship between driver characteristics, weather conditions, and infrastructure on the one hand, and driving behaviour (travel speed and attention/inattention time) on the other, was analysed using ANOVA.
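As a rough illustration of the kind of statistical comparison described above (not the authors' code), a one-way ANOVA on attention time grouped by infrastructure type could be run with SciPy; the data file and column names below are hypothetical.

```python
# Hypothetical sketch: one-way ANOVA on per-drive attention times,
# loosely following the analysis described in the abstract above.
# The CSV file and column names are assumptions, not the authors' data.
import pandas as pd
from scipy import stats

df = pd.read_csv("driving_fixations.csv")  # hypothetical file

# Group attention time (seconds) by infrastructure type,
# e.g. "roundabout" vs. "straight segment".
groups = [g["attention_time_s"].values
          for _, g in df.groupby("infrastructure_type")]

f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```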
Privacy-Preserving Eye Videos using Rubber Sheet Model
2020
HCI, Privacy
Aayush K. Chaudhary, Jeff B. Pelz
ETRA 2020 Preprint: June 2-5, 2020
Video-based eye trackers estimate gaze based on eye images/videos. As security and privacy concerns loom over technological advancements, tackling such challenges is crucial. We present a new approach to handle privacy issues in eye videos by replacing the current identifiable iris texture with a different iris template in the video capture pipeline based on the Rubber Sheet Model. We extend to image blending and median-value representations to demonstrate that videos can be manipulated without significantly degrading segmentation and pupil detection accuracy.
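The Rubber Sheet Model referenced above is the classic Daugman normalization that unwraps the annular iris region between the pupil and limbus circles into a fixed-size polar grid; a replacement iris template can then be written back through the same mapping. The sketch below shows only the unwrapping step for a grayscale eye image, with circle parameters assumed to come from an upstream pupil/iris detector; it is an illustration, not the authors' pipeline.

```python
# Minimal sketch of Daugman-style "rubber sheet" normalization for a
# grayscale eye image. Circle parameters (pupil_xy, pupil_r, iris_xy, iris_r)
# are assumed to come from an upstream detector.
import numpy as np

def rubber_sheet_unwrap(eye_img, pupil_xy, pupil_r, iris_xy, iris_r,
                        n_radial=64, n_angular=256):
    h, w = eye_img.shape[:2]
    out = np.zeros((n_radial, n_angular), dtype=eye_img.dtype)
    thetas = np.linspace(0, 2 * np.pi, n_angular, endpoint=False)
    for j, theta in enumerate(thetas):
        # Boundary points on the pupil and iris circles along this ray.
        xp = pupil_xy[0] + pupil_r * np.cos(theta)
        yp = pupil_xy[1] + pupil_r * np.sin(theta)
        xi = iris_xy[0] + iris_r * np.cos(theta)
        yi = iris_xy[1] + iris_r * np.sin(theta)
        for i, r in enumerate(np.linspace(0.0, 1.0, n_radial)):
            # Sample linearly between the pupil and iris boundaries.
            x = int(round((1 - r) * xp + r * xi))
            y = int(round((1 - r) * yp + r * yi))
            if 0 <= x < w and 0 <= y < h:
                out[i, j] = eye_img[y, x]
    return out
```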
Gaze Tracking for Eye-Hand Coordination Training Systems in Virtual Reality
2020
HCI
Aunnoy K Mutasim, Anil Ufuk Batmaz, Wolfgang Stuerzlinger
CHI EA '20: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, April 2020 Pages 1–8
Eye-hand coordination training systems are used to improve user performance during fast movements in sports training. In this work, we explored gaze tracking in a Virtual Reality (VR) sports training system with a VR headset. Twelve subjects performed a pointing study with or without passive haptic feedback. Results showed that subjects spent an average of 0.55 s to visually find and another 0.25 s before their finger selected a target. We also identified that passive haptic feedback did not increase the performance of the user. Moreover, gaze tracker accuracy significantly deteriorated when subjects looked below their eye level. Our results also point out that practitioners/trainers should focus on reducing the time spent searching for the next target to improve performance through VR eye-hand coordination training systems. We believe that current VR eye-hand coordination training systems are ready to be evaluated with athletes.
Attention-Aware Brain Computer Interface to avoid Distractions in Augmented Reality
2020
HCI
Lisa-Marie Vortmann, Felix Putze
CHI EA '20: Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, April 2020 Pages 1–8
Recently, the idea of using BCIs in Augmented Reality settings to operate systems has emerged. One problem of such head-mounted displays is the distraction caused by an unavoidable display of control elements, even when the user is focused on internal thoughts. In this project, we reduced this distraction by including information about the current attentional state. A multimodal smart-home environment was altered to adapt to the user's state of attention. The system only responded if the attentional orientation was classified as "external". The classification was based on multimodal EEG and eye tracking data. Seven users tested the attention-aware system in comparison to the unaware system. We show that the adaptation of the interface improved the usability of the system. We conclude that more systems would benefit from awareness of the user's ongoing attentional state.
Bionic Tracking: Using Eye Tracking to Track Biological Cells in Virtual Reality
2020
Biology, UI/UX
Ulrik Günther, Kyle IS Harrington, Raimund Dachselt, Ivo F. Sbalzarini
arXiv preprint arXiv:2005.00387
We present Bionic Tracking, a novel method for solving biological cell tracking problems with eye tracking in virtual reality using commodity hardware. Using gaze data, and especially smooth pursuit eye movements, we are able to track cells in time series of 3D volumetric datasets. The problem of tracking cells is ubiquitous in developmental biology, where large volumetric microscopy datasets are acquired on a daily basis, often comprising hundreds or thousands of time points that span hours or days. The image data, however, is only a means to an end, and scientists are often interested in the reconstruction of cell trajectories and cell lineage trees. Reliably tracking cells in crowded three-dimensional space over many timepoints remains an open problem, and many current approaches rely on tedious manual annotation and curation. In our Bionic Tracking approach, we substitute the usual 2D point-and-click annotation to track cells with eye tracking in a virtual reality headset, where users simply have to follow a cell with their eyes in 3D space in order to track it. We detail the interaction design of our approach and explain the graph-based algorithm used to connect different time points, also taking occlusion and user distraction into account. We demonstrate our cell tracking method using the example of two different biological datasets. Finally, we report on a user study with seven cell tracking experts, demonstrating the benefits of our approach over manual point-and-click tracking, with an estimated 2- to 10-fold speedup.
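As a heavily simplified illustration of the kind of gaze-to-cell linking the abstract describes (not the authors' graph-based algorithm), one could snap each gaze-derived 3D point to the nearest detected cell centroid per timepoint and chain the matches into a trajectory; `gaze_points` and `detections` below are hypothetical inputs.

```python
# Heavily simplified sketch, not the published method: link a gaze-followed
# 3D point to the nearest detected cell centroid at each timepoint.
# gaze_points[t] is a 3-vector; detections[t] is an (N, 3) array of centroids.
import numpy as np
from scipy.spatial import cKDTree

def link_track(gaze_points, detections, max_dist=5.0):
    track = []
    for t, gaze in enumerate(gaze_points):
        tree = cKDTree(detections[t])
        dist, idx = tree.query(gaze)
        # Skip timepoints where no detection is close enough (e.g. occlusion
        # or user distraction); the real method handles these more carefully.
        if dist <= max_dist:
            track.append((t, detections[t][idx]))
    return track
```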
Demo of the EyeSAC System for Visual Synchronization, Cleaning, and Annotation of Eye Movement Data
2020
Eye Tracking Algorithms
Ayush Kumar, Debesh Mohanty, Kuno Kurzhals, Fabian Beck, Daniel Weiskopf, Klaus Mueller
12th ACM Symposium on Eye Tracking Research & Applications (ETRA’20 Adjunct)
Eye movement data analysis plays an important role in examining human cognitive processes and perceptions. Such analysis at times needs data recording from additional sources too during experiments. In this paper, we study a pair programming based collaboration using two eye trackers, stimulus recording, and an external camera recording. To analyze the collected data, we introduce the EyeSAC system that synchronizes the data from different sources and that removes the noisy and missing gazes from eye tracking data with the help of visual feedback from the external recording. The synchronized and cleaned data is further annotated using our system and then exported for further analysis.
Noise-Robust Pupil Center Detection Through CNN-Based Segmentation With Shape-Prior Loss
2020
Eye Tracking Algorithms
Sang Yoon Han, Hyuk Jin Kwon, Yoonsik Kim, and Nam Ik Cho
IEEE Access 8 (2020)
Detecting the pupil center plays a key role in human-computer interaction, especially for gaze tracking. The conventional deep learning-based method for this problem is to train a convolutional neural network (CNN), which takes the eye image as the input and gives the pupil center as a regression result. In this paper, we propose an indirect use of the CNN for the task, which first segments the pupil region by a CNN as a classification problem, and then finds the center of the segmented region. This is based on the observation that CNN works more robustly for the pupil segmentation than for the pupil center-point regression when the inputs are noisy IR images. Specifically, we use the UNet model for the segmentation of pupil regions in IR images and then find the pupil center as the center of mass of the segment. In designing the loss function for the segmentation, we propose a new loss term that encodes the convex shape-prior for enhancing the robustness to noise. Precisely, we penalize not only the deviation of each predicted pixel from the ground truth label but also the non-convex shape of pupils caused by the noise and reflection. For the training, we make a new dataset of 111,581 images with hand-labeled pupil regions from 29 IR eye video sequences. We also label the commonly used ExCuSe and ElSe datasets, which are considered real-world noisy datasets, to validate our method. Experiments show that the proposed method performs better than the conventional methods that directly find the pupil center as a regression result.
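The post-processing step described above, segment the pupil and then take the center of mass, is simple to sketch. The snippet below assumes a per-pixel pupil probability map from some segmentation network (the UNet itself is omitted); the function and variable names are illustrative, not from the paper.

```python
# Minimal sketch of "segment, then take the center of mass": given a
# predicted pupil probability map from a segmentation network, the pupil
# center is the centroid of the thresholded pupil region.
import numpy as np

def pupil_center_from_mask(prob_map, threshold=0.5):
    mask = prob_map >= threshold
    if not mask.any():
        return None  # no pupil detected in this frame
    ys, xs = np.nonzero(mask)
    # Centroid of the thresholded pupil region, as (x, y) in pixel coordinates.
    return xs.mean(), ys.mean()
```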
XREye: Simulating Visual Impairments in Eye-Tracked XR
2020
Medical, Computer Graphics
Katharina Krösl, Carmine Elvezio, Matthias Hürbe, Sonja Karst, Steven Feiner, Michael Wimmer
2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)
Many people suffer from visual impairments, which can be difficult for patients to describe and others to visualize. To aid in understanding what people with visual impairments experience, we demonstrate a set of medically informed simulations in eye-tracked XR of several common conditions that affect visual perception: refractive errors (myopia, hyperopia, and presbyopia), cornea disease, and age-related macular degeneration (wet and dry).
Physical and perceptual measures of walking surface complexity strongly predict gait and gaze behaviour
2020
Health and Safety
Nicholas DA Thomas, James D. Gardiner, Robin H. Crompton, and Rebecca Lawson
Human Movement Science 71 (2020)
Background: Walking surfaces vary in complexity and are known to affect stability and fall risk whilst walking. However, existing studies define surfaces through descriptions only. Objective: This study used a multimethod approach to measure surface complexity in order to try to characterise surfaces with respect to locomotor stability. Methods: We assessed how physical measurements of walking surface complexity compared to participants' perceptual ratings of the effect of complexity on stability. Physical measurements included local slope measures from the surfaces themselves and shape complexity measured using generated surface models. Perceptual measurements assessed participants' perceived stability and surface roughness using Likert scales. We then determined whether these measurements were indicative of changes to stability as assessed by behavioural changes, including eye angle, head pitch angle, muscle coactivation, walking speed and walking smoothness. Results: Physical and perceptual measures were highly correlated, with more complex surfaces being perceived as more challenging to stability. Furthermore, complex surfaces, as defined from both these measurements, were associated with lowered head pitch, increased muscle coactivation and reduced walking smoothness. Significance: Our findings show that walking surfaces defined as complex, based on physical measurements, are perceived as more challenging to our stability. Furthermore, certain behavioural measures relate better to these perceptual and physical measures than others. Crucially, for the first time this study defined walking surfaces objectively rather than just based on subjective descriptions. This approach could enable future researchers to compare results across walking surface studies. Moreover, perceptual measurements, which can be collected easily and efficiently, could be used as a proxy for estimating behavioural responses to different surfaces. This could be particularly valuable when determining risk of instability when walking for individuals with compromised stability.
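As a rough illustration of relating a physical surface-complexity measure to Likert-scale perceptual ratings, as described above, a rank correlation could be computed as below; the data file and column names are assumptions, not the authors' variables.

```python
# Hypothetical sketch: Spearman rank correlation between a physical
# complexity measure and Likert-scale perceived stability. Spearman is used
# because Likert ratings are ordinal. File and column names are assumptions.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("surface_ratings.csv")  # hypothetical file
rho, p = spearmanr(df["local_slope_variability"], df["perceived_stability"])
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```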
  • ...
  • ...