Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


L2CS-Net: Fine-Grained Gaze Estimation in Unconstrained Environments

Ahmed A. Abdelrahman, Thorsten Hempel, Aly Khalifa, Ayoub Al-Hamadi

2022-03-07 · Gaze Estimation · Gaze Prediction

Abstract

Human gaze is a crucial cue used in various applications such as human-robot interaction and virtual reality. Recently, convolutional neural network (CNN) approaches have made notable progress in predicting gaze direction. However, estimating gaze in the wild remains a challenging problem due to the uniqueness of eye appearance, lighting conditions, and the diversity of head poses and gaze directions. In this paper, we propose a robust CNN-based model for predicting gaze in unconstrained settings. We propose to regress each gaze angle separately to improve the per-angle prediction accuracy, which enhances overall gaze performance. In addition, we use two identical losses, one for each angle, to improve network learning and increase its generalization. We evaluate our model on two popular datasets collected in unconstrained settings. Our proposed model achieves state-of-the-art accuracy of 3.92° and 10.41° on the MPIIGaze and Gaze360 datasets, respectively. We make our code open source at https://github.com/Ahmednull/L2CS-Net.
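The abstract's key idea — regressing each gaze angle (yaw and pitch) separately, with one identical loss per angle — can be illustrated with a small sketch. The specific loss form below (cross-entropy over discretized angle bins plus an MSE term on the softmax-expected continuous angle, with a hypothetical `beta` weight and 3° bin width) is an assumption for illustration, not a verbatim reproduction of the paper's implementation:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def per_angle_loss(logits, gt_angle, bin_centers, beta=1.0):
    """One gaze angle's loss: cross-entropy over discretized bins
    plus MSE on the softmax-expected continuous angle.
    (Hypothetical sketch of the 'two identical losses' idea;
    bin width and beta are illustrative assumptions.)"""
    p = softmax(logits)
    # classification target: the bin closest to the ground-truth angle
    gt_bin = int(np.argmin(np.abs(bin_centers - gt_angle)))
    ce = -np.log(p[gt_bin] + 1e-12)
    # regression: expected angle under the predicted distribution
    pred_angle = float(np.dot(p, bin_centers))
    mse = (pred_angle - gt_angle) ** 2
    return ce + beta * mse

# Yaw and pitch get separate network heads but identical loss forms.
bin_centers = np.arange(-90.0, 91.0, 3.0)  # assumed 3-degree bins
yaw_logits = np.random.randn(len(bin_centers))
pitch_logits = np.random.randn(len(bin_centers))
total_loss = (per_angle_loss(yaw_logits, 20.0, bin_centers)
              + per_angle_loss(pitch_logits, -5.0, bin_centers))
```

Treating each angle as its own classification-plus-regression problem lets the two heads specialize, which is the stated motivation for the separate, identical per-angle losses.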

Results

Task             | Dataset   | Metric        | Value | Model
-----------------|-----------|---------------|-------|------
Gaze Estimation  | Gaze360   | Angular Error | 10.41 | L2CS
Gaze Estimation  | MPII Gaze | Angular Error | 3.92  | L2CS

Related Papers

- Inference-Time Gaze Refinement for Micro-Expression Recognition: Enhancing Event-Based Eye Tracking with Motion-Aware Post-Processing (2025-06-14)
- Evaluating Sensitivity Parameters in Smartphone-Based Gaze Estimation: A Comparative Study of Appearance-Based and Infrared Eye Trackers (2025-06-13)
- EgoM2P: Egocentric Multimodal Multitask Pretraining (2025-06-09)
- MAC-Gaze: Motion-Aware Continual Calibration for Mobile Gaze Tracking (2025-05-28)
- MAGE: A Multi-task Architecture for Gaze Estimation with an Efficient Calibration Module (2025-05-22)
- Ocular Authentication: Fusion of Gaze and Periocular Modalities (2025-05-22)
- A Generalized Label Shift Perspective for Cross-Domain Gaze Estimation (2025-05-19)
- GA3CE: Unconstrained 3D Gaze Estimation with Gaze-Aware 3D Context Encoding (2025-05-15)