TRAM: Global Trajectory and Motion of 3D Humans from in-the-wild Videos

26 Mar 2024  ·  Yufu Wang, Ziyun Wang, Lingjie Liu, Kostas Daniilidis

We propose TRAM, a two-stage method to reconstruct a human's global trajectory and motion from in-the-wild videos. TRAM robustifies SLAM to recover the camera motion in the presence of dynamic humans and uses the scene background to derive the motion scale. Using the recovered camera as a metric-scale reference frame, we introduce a video transformer model (VIMO) to regress the kinematic body motion of a human. By composing the two motions, we achieve accurate recovery of 3D humans in the world space, reducing global motion errors by 60% from prior work. https://yufu-wang.github.io/tram4d/
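The final composition step — lifting camera-frame body motion into the world using the metric-scale camera poses recovered by SLAM — amounts to applying a per-frame rigid transform. The sketch below illustrates that idea only; the function and variable names are hypothetical and do not reflect TRAM's actual code or interfaces.

```python
import numpy as np

def compose_global_motion(R_cw, t_cw, joints_cam):
    """Illustrative sketch: map per-frame 3D joints from camera
    coordinates into world coordinates with world-from-camera poses.

    R_cw:       (T, 3, 3) per-frame camera-to-world rotations
    t_cw:       (T, 3)    per-frame camera-to-world translations (metric scale)
    joints_cam: (T, J, 3) body joints in each frame's camera coordinates
    Returns     (T, J, 3) joints in the shared world frame.
    """
    # x_world = R_cw @ x_cam + t_cw, applied independently to every frame
    return np.einsum('tij,tkj->tki', R_cw, joints_cam) + t_cw[:, None, :]
```

With accurate metric-scale camera poses, the world-frame joints directly yield the human's global trajectory (e.g., the root joint's path over time).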


Results from the Paper


Task: 3D Human Pose Estimation  ·  Dataset: EMDB  ·  Model: TRAM

Average MPJPE (mm): 74.4  (rank #1)
Average MPJPE-PA (mm): 45.7  (rank #1)
Average MVE (mm): 86.6  (rank #1)

Methods


No methods listed for this paper.