Personalized Image-based User Authentication using Wearable Cameras
Personal devices are a part of our everyday life; they operate on and store users' private data. These devices face an array of threats that have become a primary concern for users. Existing authentication mechanisms require users to register and remember one static password per device. However, static passwords are vulnerable to a variety of attacks.
Ngu Nguyen et al. propose a novel personalized user authentication mechanism that generates image-based passwords from egocentric videos captured by wearable cameras.
The growing adoption of wearable cameras, and the volume of video they generate, has also increased the need for egocentric video summarization research, which aims to represent video content in a more compact form. It has been shown that gaze data can improve the performance of summarization.
The researchers use Pupil to capture egocentric video and gaze data. Fixations extracted from the gaze data can be used to identify potentially meaningful or salient segments of the egocentric video. These segments can then be used to split the video into groups of personalized images for the authentication challenge.
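As a rough illustration of this idea (not the authors' actual implementation), fixation events can be filtered by duration and merged into salient segments. The `min_duration` and `merge_gap` thresholds below are hypothetical, and fixations are assumed to arrive as `(start, end)` timestamps in seconds:

```python
# Sketch: select salient video segments from gaze fixations.
# Assumes fixations are (start_s, end_s) tuples from a gaze tracker;
# thresholds are illustrative, not from the paper.

def salient_segments(fixations, min_duration=0.3, merge_gap=1.0):
    """Keep fixations at least min_duration seconds long, then merge
    fixations separated by less than merge_gap seconds into one segment."""
    long_fix = [(s, e) for s, e in fixations if e - s >= min_duration]
    segments = []
    for start, end in sorted(long_fix):
        if segments and start - segments[-1][1] <= merge_gap:
            # Close to the previous segment: extend it instead of
            # starting a new one.
            segments[-1] = (segments[-1][0], max(segments[-1][1], end))
        else:
            segments.append((start, end))
    return segments

fixations = [(0.0, 0.5), (0.6, 1.2), (5.0, 5.1), (8.0, 8.8)]
print(salient_segments(fixations))  # → [(0.0, 1.2), (8.0, 8.8)]
```

Frames sampled from each resulting segment could then serve as the candidate images for a password challenge.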
Check out their full paper here.
If you use Pupil in your research and have published work, please send us a note. We would love to include your work here on the blog and in a list of work that cites Pupil.