Egocentric Activity Recognition
14 papers with code • 2 benchmarks • 4 datasets
Most implemented papers
Ego-Exo: Transferring Visual Representations from Third-person to First-person Videos
We introduce an approach for pre-training egocentric video models using large-scale third-person video datasets.
Group Contextualization for Video Recognition
By utilizing calibrators to embed features with four different kinds of contexts in parallel, the learned representation is expected to be more resilient to diverse types of activities.
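The parallel-calibrator idea can be sketched roughly as follows. This is an illustrative assumption, not the paper's actual architecture: the four calibrator functions and the averaging fusion are placeholders standing in for the paper's learned context embeddings.

```python
import numpy as np

def contextualize(feature, calibrators):
    """Apply each context calibrator to the same feature in parallel,
    then fuse the calibrated copies by averaging (fusion rule assumed)."""
    calibrated = [c(feature) for c in calibrators]
    return np.mean(calibrated, axis=0)

# Toy stand-ins for four kinds of context calibrators (names hypothetical).
temporal_cal = lambda f: f * 1.1
spatial_cal  = lambda f: f * 0.9
channel_cal  = lambda f: f + 0.5
global_cal   = lambda f: f - 0.5

feat = np.ones(4)  # a dummy 4-dim video feature
out = contextualize(feat, [temporal_cal, spatial_cal, channel_cal, global_cal])
```

The point of the sketch is only the structure: every calibrator sees the same input feature, runs independently, and the fused output carries all four context views.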
Towards Continual Egocentric Activity Recognition: A Multi-modal Egocentric Activity Dataset for Continual Learning
However, the lack of related datasets hinders the development of multi-modal deep learning for egocentric activity recognition.
WEAR: An Outdoor Sports Dataset for Wearable and Egocentric Activity Recognition
Though research has shown the complementarity of camera- and inertial-based data, datasets which offer both egocentric video and inertial-based sensor data remain scarce.