Gaze Estimation
74 papers with code • 9 benchmarks • 16 datasets
Gaze Estimation is the task of predicting where a person is looking, given an image of the person's full face. The task has two variants: 3D gaze vector estimation and 2D gaze position estimation. 3D gaze vector estimation predicts the gaze direction vector and is commonly used in automotive safety. 2D gaze position estimation predicts the horizontal and vertical coordinates of the gaze point on a 2D screen, which allows the gaze point to drive a cursor for human-machine interaction.
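The two representations are closely related: a 3D gaze vector is typically reduced to pitch and yaw angles, and model quality is reported as the angle between predicted and ground-truth vectors. Below is a minimal sketch of these conversions; the axis convention (camera looking along -z, y pointing down) is an assumption, as datasets differ.

```python
import math

def vector_to_pitch_yaw(gx, gy, gz):
    """Convert a 3D gaze direction to (pitch, yaw) in radians.
    Axis convention is an assumption here: camera looks along -z, y points down.
    """
    norm = math.sqrt(gx * gx + gy * gy + gz * gz)
    gx, gy, gz = gx / norm, gy / norm, gz / norm
    pitch = math.asin(-gy)         # vertical angle (up is positive)
    yaw = math.atan2(-gx, -gz)     # horizontal angle (left is positive)
    return pitch, yaw

def angular_error_deg(a, b):
    """Angle in degrees between two gaze vectors (the usual 3D gaze metric)."""
    na = math.sqrt(sum(v * v for v in a))
    nb = math.sqrt(sum(v * v for v in b))
    dot = sum(x * y for x, y in zip(a, b)) / (na * nb)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))
```

For example, a gaze vector pointing straight into the camera, (0, 0, -1) under this convention, maps to pitch = yaw = 0, and two orthogonal vectors give an angular error of 90 degrees.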
Source: A Generalized and Robust Method Towards Practical Gaze Estimation on Smart Phone
Most implemented papers
Learning to Find Eye Region Landmarks for Remote Gaze Estimation in Unconstrained Settings
Conventional feature-based and model-based gaze estimation methods have proven to perform well in settings with controlled illumination and specialized cameras.
RITnet: Real-time Semantic Segmentation of the Eye for Gaze Tracking
Accurate eye segmentation can improve eye-gaze estimation and support interactive computing based on visual attention; however, existing eye segmentation methods suffer from issues such as person-dependent accuracy, lack of robustness, and an inability to be run in real-time.
Towards High Performance Low Complexity Calibration in Appearance Based Gaze Estimation
Appearance-based gaze estimation from RGB images provides relatively unconstrained gaze tracking.
Self-Learning Transformations for Improving Gaze and Head Redirection
Furthermore, we show that in the presence of limited amounts of real-world training data, our method allows for improvements in the downstream task of semi-supervised cross-dataset gaze estimation.
TinyTracker: Ultra-Fast and Ultra-Low-Power Edge Vision In-Sensor for Gaze Estimation
We propose TinyTracker, a highly efficient, fully quantized model for 2D gaze estimation designed to maximize the performance of the edge vision systems considered in this study.
Spatio-Temporal Attention and Gaussian Processes for Personalized Video Gaze Estimation
Additionally, our approach integrates Gaussian processes to include individual-specific traits, facilitating the personalization of our model with just a few labeled samples.
Ecological Sampling of Gaze Shifts
Visual attention guides our gaze to relevant parts of the viewed scene, yet the moment-to-moment relocation of gaze can differ among observers even when the same locations are taken into account.
Pupil: An Open Source Platform for Pervasive Eye Tracking and Mobile Gaze-based Interaction
Commercial head-mounted eye trackers provide useful features to customers in industry and research but are expensive and rely on closed source hardware and software.
Deep Pictorial Gaze Estimation
In this paper, we introduce a novel deep neural network architecture specifically designed for the task of gaze estimation from single eye input.
RT-GENE: Real-Time Eye Gaze Estimation in Natural Environments
We first record a novel dataset of varied gaze and head pose images in a natural environment, addressing the issue of ground truth annotation by measuring head pose using a motion capture system and eye gaze using mobile eye-tracking glasses.