Real-time Pupil Tracking from Monocular Video for Digital Puppetry
Artsiom Ablavatski, Andrey Vakunov, Ivan Grishchenko, Karthik Raveendran, Matsvei Zhdanovich
2020-06-19 · Pupil Tracking
Abstract
We present a simple, real-time approach for pupil tracking from live video on mobile devices. Our method extends a state-of-the-art face mesh detector with two new components: a tiny neural network that predicts the 2D positions of the pupils, and a displacement-based estimation of the pupil blend shape coefficients. Our technique can be used to accurately control the pupil movements of a virtual puppet, lending it liveliness and energy. The proposed approach runs at over 50 FPS on modern phones, enabling its use in any real-time puppeteering pipeline.
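To make the displacement-based step concrete, the sketch below shows one way such an estimation could look: the pupil position from the tiny network is compared against eye landmarks from the face mesh, and the normalized displacement is split into per-direction coefficients. This is a minimal illustration under assumed conventions, not the paper's actual formulation; the function name, landmark choice, and coefficient naming are all hypothetical.

```python
import numpy as np

def pupil_blend_shapes(pupil, inner_corner, outer_corner, upper_lid, lower_lid):
    """Hypothetical sketch of displacement-based pupil blend shape estimation.

    Inputs are 2D landmark positions (np.array([x, y])) for one eye, e.g.
    eye corners and lids from a face mesh plus a predicted pupil center.
    Returns four coefficients in [0, 1]: look_left, look_right, look_up,
    look_down.
    """
    # Approximate the eye center as the midpoint of the horizontal corners.
    center = (inner_corner + outer_corner) / 2.0

    # Normalize displacement by eye width/height so the coefficients are
    # invariant to face scale and distance from the camera.
    eye_width = np.linalg.norm(outer_corner - inner_corner)
    eye_height = np.linalg.norm(upper_lid - lower_lid)

    dx = (pupil[0] - center[0]) / (eye_width / 2.0 + 1e-6)
    dy = (pupil[1] - center[1]) / (eye_height / 2.0 + 1e-6)

    # Split the signed displacement into one coefficient per direction,
    # clamped to [0, 1] (image y grows downward, so +dy means "down").
    look_left  = float(np.clip(-dx, 0.0, 1.0))
    look_right = float(np.clip( dx, 0.0, 1.0))
    look_up    = float(np.clip(-dy, 0.0, 1.0))
    look_down  = float(np.clip( dy, 0.0, 1.0))
    return look_left, look_right, look_up, look_down
```

Coefficients of this form can drive a puppet rig directly, since each one maps to a single "look" blend shape and degrades gracefully when the pupil sits near the eye center.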