Publications

Explore a collection of 453 publications and projects, from diverse fields, that cite Pupil Labs and use Pupil Labs eye tracking hardware and software in their research.

Assessments on Human-Computer Interaction Using Touchscreen as Control Inputs in Flight Operations
2022
Flight Operations, HCI, Usability, Ergonomics
In: Harris, D., Li, W.C. (eds) Engineering Psychology and Cognitive Ergonomics. HCII 2022. Lecture Notes in Computer Science, vol. 13307. Springer, Cham.
Developing touchscreen technology in the cockpit can integrate control inputs and outputs on the same display in flight operations. Flight systems could be updated by modifying the touchscreen user interface without the complicated process of reconfiguring cockpit panels. Touchscreen components carry a potential risk of inadvertent touch, which may be defined as any system-detectable touch issued to the touch sensors without the pilot’s operational consent. Pilots’ visual behaviours can be explored by using eye trackers to analyze the relationship between eye scan patterns and attention shifts while conducting monitoring tasks in flight operations. This research aims to evaluate human-computer interaction using an eye tracker to investigate the safety concerns of implementing touchscreens in flight operations. The scenario was an instrument landing on the final approach in a future-systems simulator. Participants were required to interact with all the control surfaces and checklists using touchscreens located in different areas of the cockpit. Each participant performed the landing scenario as pilot-flying (PF) and pilot-monitoring (PM) in random sequence. Currently, the PF and PM perform different tasks related to control inputs and the monitoring of control outputs on the flight deck. The PF’s primary obligation is to fly the aircraft’s flight path, and the PM’s main responsibility is to monitor the aircraft’s flight path and cross-check the PF’s operational behaviours. Analyzing participants’ visual behaviours and scanning patterns yields HCI findings applicable to the use of touchscreens in future flight deck design. Implementing touchscreens in future flight decks offers benefits if human-centred design principles are integrated at an early stage.
Overview of Controllers of User Interface for Virtual Reality
2022
Virtual Reality
Novacek, T., & Jirina, M.
PRESENCE: Virtual and Augmented Reality, 1-100.
Virtual reality has been with us for several decades already, but we are still trying to find the right ways to control it. There are many controllers with various purposes and means of input, each with its advantages and disadvantages, but also with specific ways to be handled. Our hands were the primary means of input for human-computer interaction for a long time. However, we can now use movements of our eyes, our feet or even our whole body to control the virtual environment, interact with it, or move from one place to another. We can achieve this with various controllers and wearable interfaces, like eye tracking, haptic suits or treadmills. There are numerous devices to choose from in every category, but it can sometimes be hard to pick the one that suits our intentions best. This article summarises all types of user interface controllers for virtual reality, with their main pros and cons, and compares them.
COLET: A Dataset for Cognitive WorkLoad Estimation Based on Eye-Tracking
2022
Cognitive Workload
Ktistakis, E., Skaramagkas, V., Manousos, D., Tachos, N. S., Tripoliti, E., Fotiadis, D. I., & Tsiknakis, M.
Computer Methods and Programs in Biomedicine 00 (2022) 1–12
Cognitive workload is an important component in performance psychology, ergonomics, and human factors. Unfortunately, publicly available datasets are scarce, making it difficult to establish new approaches and comparative studies. In this work, COLET, a COgnitive workLoad estimation dataset based on Eye-Tracking, is presented. Forty-seven (47) individuals’ eye movements were monitored as they solved puzzles involving visual search tasks of varying complexity and duration. The authors give an in-depth study of the participants’ performance during the experiments; eye and gaze features were derived from low-level recorded eye metrics, and their relationships with the experimental tasks were investigated. Results from classifying cognitive workload levels solely on the basis of eye data, obtained by employing and testing a set of machine learning algorithms, are also provided. The dataset is available to the academic community.
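The feature-derivation step described above can be illustrated with a short sketch. The snippet below is not the COLET pipeline; it computes a few common gaze features (mean velocity, saccade rate, fixation ratio) from raw gaze samples using a simple velocity-threshold (I-VT) split, with the threshold value and feature names chosen purely for illustration.

```python
import numpy as np

def eye_features(x, y, t, sacc_vel_thresh=30.0):
    """Toy gaze-feature extraction from raw gaze samples.

    x, y are gaze coordinates in degrees, t timestamps in seconds;
    sacc_vel_thresh (deg/s) is an illustrative I-VT threshold, not a
    value taken from the COLET paper.
    """
    dt = np.diff(t)
    vel = np.hypot(np.diff(x), np.diff(y)) / dt          # sample-to-sample velocity
    is_sacc = vel > sacc_vel_thresh                      # I-VT style split
    onsets = np.flatnonzero(np.diff(is_sacc.astype(int)) == 1)
    duration = t[-1] - t[0]
    return {
        "mean_velocity": float(vel.mean()),
        "saccade_rate": len(onsets) / duration,          # saccade onsets per second
        "fixation_ratio": float(1.0 - is_sacc.mean()),   # share of samples below threshold
    }

# Usage with synthetic 100 Hz gaze data
rng = np.random.default_rng(0)
t = np.arange(0, 10, 0.01)
x, y = rng.normal(0, 1, t.size), rng.normal(0, 1, t.size)
print(eye_features(x, y, t))
```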
Differentiating Endogenous and Exogenous Attention Shifts Based on Fixation-Related Potentials
2022
Attention
Vortmann, L. M., Schult, M., & Putze, F.
27th International Conference on Intelligent User Interfaces (pp. 243-257).
Attentional shifts can occur voluntarily (endogenous control) or reflexively (exogenous control). Previous studies have shown that the neural mechanisms underlying these shifts produce different activity patterns in the brain. Changes in visual-spatial attention are usually accompanied by eye movements and a fixation on the new center of attention. In this study, we analyze the fixation-related potentials in electroencephalographic recordings of 10 participants during computer screen-based viewing tasks. During task performance, we presented salient visual distractors to evoke reflexive attention shifts. Surrounding each fixation, 0.7-second data windows were extracted and labeled as “endogenous” or “exogenous”. Averaged over all participants, the balanced classification accuracy using a person-dependent Linear Discriminant Analysis reached 59.84%. In a leave-one-participant-out approach, the average classification accuracy reached 58.48%. Differentiating attention shifts based on fixation-related potentials could be used to deepen the understanding of human viewing behavior or as a Brain-Computer Interface for attention-aware user interface adaptations.
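The evaluation described in this abstract (person-dependent LDA plus a leave-one-participant-out split) maps naturally onto scikit-learn's LeaveOneGroupOut cross-validator. A minimal sketch with placeholder data, assuming one feature vector per 0.7-second fixation-locked window and a participant id per window:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

# Placeholder stand-ins: one feature vector per 0.7 s window,
# a binary label (0 = endogenous, 1 = exogenous), and a participant id.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 64))
y = rng.integers(0, 2, size=1000)
participant = rng.integers(0, 10, size=1000)

# Each fold holds out every window from one participant
scores = cross_val_score(
    LinearDiscriminantAnalysis(), X, y,
    groups=participant, cv=LeaveOneGroupOut(),
    scoring="balanced_accuracy",
)
print(f"leave-one-participant-out balanced accuracy: {scores.mean():.3f}")
```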
Listen Carefully protocol: an exploratory case–control study of the association between listening effort and cognitive function
2022
Linguistics
Feldman, A., Patou, F., Baumann, M., Stockmarr, A., Waldemar, G., Maier, A. M., & Vogel, A.
BMJ Open, 12(3).
Introduction: A growing body of evidence suggests that hearing loss is a significant and potentially modifiable risk factor for cognitive impairment. Although the mechanisms underlying the associations between cognitive decline and hearing loss are unclear, listening effort has been posited as one of the mechanisms involved with cognitive decline in older age. To date, there has been a lack of research investigating this association, particularly among adults with mild cognitive impairment (MCI). Methods and analysis: 15–25 cognitively healthy participants and 15–25 patients with MCI (age 40–85 years) will be recruited to participate in an exploratory study investigating the association between cognitive functioning and listening effort. Both behavioural and objective measures of listening effort will be investigated. The sentence-final word identification and recall (SWIR) test will be administered with single-talker non-intelligible speech background noise while pupil dilation is monitored. Evaluation of cognitive function will be carried out in a clinical setting using a battery of neuropsychological tests. This study is considered exploratory and proof of concept, with its results informing the validity of larger-scale trials.
The emotional influence of different geometries in virtual spaces: A neurocognitive examination
2022
Neuropsychology, emotion
Shemesh, A., Leisman, G., Bar, M., & Grobman, Y. J.
Journal of Environmental Psychology, 81, 101802.
In this paper, a multidisciplinary approach to examining the connection between visual perception, human emotions and architectural space is presented. It details a study in which emotional reactions to architectural space geometry are empirically measured and quantified. Using various sensors, including EEG (Electroencephalography), GSR (Galvanic Skin Response), and eye-tracking (ET), we collected data from 112 individuals experiencing virtual environments (VEs) characterized by a variance of geometric manipulations. Diffusion map algorithms, as well as other statistical methods, were used to analyze the data. Findings suggest that criteria of protrusion, curvature, scale and proportion of space influence the user's emotional state. Indices of ET, GSR, electrical brain activity, as well as dwelling duration and self-report liking ranks, show both “negative” and “positive” interest changes.
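Diffusion maps, mentioned above, embed high-dimensional sensor data into a few coordinates that preserve the data's intrinsic geometry. Below is a minimal sketch of the textbook algorithm, not the authors' implementation; the Gaussian kernel bandwidth eps is an assumed free parameter:

```python
import numpy as np

def diffusion_map(X, eps=1.0, n_components=2, t=1):
    """Minimal diffusion-map embedding (standard algorithm).

    eps is an assumed Gaussian kernel bandwidth, not a value
    from the paper.
    """
    # Pairwise squared Euclidean distances -> Gaussian affinities
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / eps)
    # Row-normalise into a Markov transition matrix
    P = K / K.sum(axis=1, keepdims=True)
    # Eigendecompose; the leading eigenpair (eigenvalue 1) is trivial
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Diffusion coordinates: eigenvectors scaled by eigenvalue^t
    return vecs[:, 1:n_components + 1] * vals[1:n_components + 1] ** t

# Usage: embed 200 synthetic sensor-feature vectors into 2 coordinates
emb = diffusion_map(np.random.default_rng(0).normal(size=(200, 10)))
print(emb.shape)  # (200, 2)
```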
Applying machine learning to dissociate between stroke patients and healthy controls using eye movement features obtained from a virtual reality task
2022
Neuropsychology
Brouwer, V. H., Stuit, S., Hoogerbrugge, A., Ten Brink, A. F., Gosselt, I. K., Van der Stigchel, S., & Nijboer, T. C
Heliyon, 8(4), e09207.
Conventional neuropsychological tests do not represent the complex and dynamic situations encountered in daily life. Immersive virtual reality simulations can be used to simulate dynamic and interactive situations in a controlled setting. Adding eye tracking to such simulations may provide highly detailed outcome measures, and has great potential for neuropsychological assessment. Here, participants (83 stroke patients and 103 healthy controls) were instructed to find either 3 or 7 items from a shopping list in a virtual supermarket environment while their eye movements were recorded. Using Logistic Regression and Support Vector Machine models, we aimed to predict the task of the participant and whether they belonged to the stroke or the control group. With a limited number of eye movement features, our models achieved an average Area Under the Curve (AUC) of 0.76 in predicting whether each participant was assigned a short or long shopping list (3 or 7 items). Identifying participants as either stroke patients or controls led to an AUC of 0.64. In both classification tasks, the frequency with which aisles were revisited was the most dissociating feature. As such, eye movement data obtained from a virtual reality simulation contain a rich set of signatures for detecting cognitive deficits, opening the door to potential clinical applications.
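The classification setup reported here (a handful of eye-movement features per participant, a linear model, AUC as the score) can be sketched in a few lines of scikit-learn. All data below are random placeholders, not the study's features:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder data: one row of eye-movement features per participant
# (e.g., aisle revisits, fixation counts); 1 = stroke, 0 = control.
rng = np.random.default_rng(0)
X = rng.normal(size=(186, 5))   # 83 patients + 103 controls
y = rng.integers(0, 2, size=186)

# Standardize features, then fit a logistic-regression classifier
clf = make_pipeline(StandardScaler(), LogisticRegression())
auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean()
print(f"cross-validated AUC: {auc:.2f}")
```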
The pupillary light response as a physiological index of aphantasia, sensory and phenomenological imagery strength
2022
Mental Imagery, Pupillometry
Kay, L., Keogh, R., Andrillon, T., & Pearson, J.
Elife 11 (2022): e72484.
The pupillary light response is an important automatic physiological response which optimizes the light reaching the retina. Recent work has shown that the pupil also adjusts in response to illusory brightness and a range of cognitive functions; however, it remains unclear what exactly drives these endogenous changes. Here, we show that the imagery pupillary light response correlates with objective measures of sensory imagery strength. Further, the trial-by-trial phenomenological vividness of visual imagery is tracked by the imagery pupillary light response. We also demonstrate that a group of individuals without visual imagery (aphantasia) do not show any significant evidence of an imagery pupillary light response; however, they do show perceptual pupillary light responses and pupil dilation under larger cognitive load. Our results provide evidence that the pupillary light response indexes the sensory strength of visual imagery. This work also provides the first physiological validation of aphantasia.
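The core analysis here is a trial-by-trial association between pupil response and vividness ratings. A minimal sketch with synthetic data follows; Spearman's rank correlation is used because vividness ratings are ordinal, though the authors' exact statistics may differ:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_trials = 60
# Placeholder data: per-trial pupil constriction to imagined "bright"
# stimuli (relative to "dark" ones) and the vividness rating reported
# on that trial (ordinal 1-4 scale).
pupil_response = rng.normal(size=n_trials)
vividness = rng.integers(1, 5, size=n_trials)

rho, p = spearmanr(pupil_response, vividness)
print(f"trial-by-trial rho = {rho:.2f}, p = {p:.3f}")
```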
Understanding, addressing, and analysing digital eye strain in virtual reality head-mounted displays
2022
Digital eye strain
Hirzle, T., Fischbach, F., Karlbauer, J., Jansen, P., Gugenheimer, J., Rukzio, E., & Bulling, A.
ACM Transactions on Computer-Human Interaction (TOCHI), 29(4), 1-80.
Digital eye strain (DES), caused by prolonged exposure to digital screens, stresses the visual system and negatively affects users’ well-being and productivity. While DES is well studied for computer displays, its impact on users of virtual reality (VR) head-mounted displays (HMDs) is largely unexplored, even though some of their key properties (e.g., the vergence–accommodation conflict) make VR HMDs particularly prone to it. This work provides the first comprehensive investigation into DES in VR HMDs. We present results from a survey with 68 experienced users to understand DES symptoms in VR HMDs. To help address DES, we investigate eye exercises derived from the survey answers, as well as blue-light filtering, in three user studies (N = 71). Results demonstrate that eye exercises, but not blue-light filtering, can effectively reduce DES. We conclude with an extensive analysis of the user studies and condense our findings into 10 key challenges that guide future work in this emerging research area.