FaceVR: Real-Time Facial Reenactment and Eye Gaze Control in VR

Community Stories

Author(s): Pupil Dev Team

November 25, 2016

Justus Thies et al. have developed a novel real-time, gaze-aware facial capture system that drives a photo-realistically reconstructed digital face in virtual reality. Their approach enables facial reenactment that transfers facial expressions and realistic eye appearance from a source actor to a target actor video.

Real-Time Facial Reenactment and Eye Gaze Control in VR

Source: FaceVR: Real-Time Facial Reenactment and Eye Gaze Control in VR

Head-mounted displays (HMDs) provide immersive renderings of virtual environments, but in doing so they occlude the majority of the wearer's face. To reconstruct the wearer's face, Thies et al. use an RGB camera to capture the facial performance and Pupil Labs' Oculus Rift DK2 add-on cup to capture eye movements inside the HMD. The source actor's facial and eye movement data then drives the photo-realistic facial animation of the target video, enabling gaze-aware facial reenactment.
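For readers who want to experiment with the eye-tracking half of such a pipeline, here is a minimal sketch (not from the FaceVR paper) of how gaze data could be streamed from Pupil Capture's network API and mapped onto an avatar's eye pose. The default Pupil Remote port, topic names, and the `norm_pos` field follow Pupil's documented IPC backbone, but treat them as assumptions and check them against your Pupil Capture version.

```python
import zmq
import msgpack

ctx = zmq.Context()

# Ask Pupil Remote (default port 50020) for the data subscription port.
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")
pupil_remote.send_string("SUB_PORT")
sub_port = pupil_remote.recv_string()

# Subscribe to gaze topics on the IPC backbone.
subscriber = ctx.socket(zmq.SUB)
subscriber.connect(f"tcp://127.0.0.1:{sub_port}")
subscriber.setsockopt_string(zmq.SUBSCRIBE, "gaze.")

while True:
    topic, payload = subscriber.recv_multipart()
    gaze = msgpack.loads(payload, raw=False)
    # 'norm_pos' is the gaze point in normalized coordinates; a reenactment
    # pipeline could map this onto the target face's eye orientation.
    x, y = gaze["norm_pos"]
    print(f"{topic.decode()}: gaze at ({x:.3f}, {y:.3f})")
```

In a reenactment setting, the gaze samples would be consumed by the rendering loop rather than printed, and combined with the RGB-camera-based expression tracking described above.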

Check out their full paper, available on arXiv: FaceVR: Real-Time Facial Reenactment and Eye Gaze Control in VR.

If you use Pupil in your research and have published work, please send us a note. We would love to feature the work here on the blog and in a list of work that cites Pupil.