Egocentric Activity Recognition
14 papers with code • 2 benchmarks • 4 datasets
Latest papers
LSTA: Long Short-Term Attention for Egocentric Action Recognition
Egocentric activity recognition is one of the most challenging tasks in video analysis.
Attention is All We Need: Nailing Down Object-centric Attention for Egocentric Activity Recognition
Our model is built on the observation that egocentric activities are highly characterized by the objects and their locations in the video.
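As a generic illustration of this idea (not the paper's exact architecture), spatial attention over a per-frame convolutional feature map can be sketched as scoring each location against a learned vector, softmax-normalizing, and pooling; the function name and the weight vector `w` are assumptions for the sketch:

```python
import numpy as np

def spatial_attention_pool(feat_map, w):
    """Pool a CNN feature map with spatial (object-centric) attention.

    feat_map: (H, W, C) per-frame convolutional features.
    w: (C,) attention projection (stand-in for learned weights).
    Returns a (C,) attention-weighted frame descriptor.
    """
    H, W, C = feat_map.shape
    flat = feat_map.reshape(-1, C)                # (H*W, C) locations
    scores = flat @ w                             # saliency per location
    scores -= scores.max()                        # stabilize the softmax
    attn = np.exp(scores) / np.exp(scores).sum()  # weights sum to 1
    return attn @ flat                            # weighted pooling

# Toy usage with random features in place of real CNN activations
rng = np.random.default_rng(0)
fm = rng.standard_normal((7, 7, 64))
vec = spatial_attention_pool(fm, rng.standard_normal(64))
```

Locations containing the manipulated object would receive higher saliency scores, so the pooled descriptor emphasizes object regions rather than background.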
A Correlation Based Feature Representation for First-Person Activity Recognition
The per-frame (per-segment) extracted features are treated as a set of time series, and inter- and intra-time-series relations are used to build the video descriptor.
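As a hedged sketch of this style of representation (not the paper's exact formulation), each feature dimension can be viewed as a time series across the video's frames, and the pairwise correlations between dimensions form a fixed-size descriptor regardless of video length:

```python
import numpy as np

def correlation_descriptor(frame_feats):
    """Build a video descriptor from inter-time-series correlations.

    frame_feats: (T, D) array; column d is the time series of feature d
    over T frames. Returns the upper triangle of the D x D correlation
    matrix as a length D*(D-1)/2 descriptor, independent of T.
    """
    corr = np.corrcoef(frame_feats, rowvar=False)  # (D, D) correlations
    iu = np.triu_indices_from(corr, k=1)           # drop diagonal + mirror
    return corr[iu]

# Toy usage: 120 frames of 16-dimensional features
rng = np.random.default_rng(1)
desc = correlation_descriptor(rng.standard_normal((120, 16)))
```

Because the descriptor depends only on the feature dimensionality D, videos of different lengths map to vectors of the same size, which is convenient for a downstream classifier.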
First-Person Hand Action Benchmark with RGB-D Videos and 3D Hand Pose Annotations
Our dataset and experiments can be of interest to the 3D hand pose estimation, 6D object pose, and robotics communities, as well as to action recognition.