AIR-Act2Act is a human-human interaction dataset for teaching non-verbal social behaviors to robots. It differs from other datasets in that elderly people participated as the performers. The authors recruited 100 elderly people and two college students to perform 10 interaction scenarios in an indoor environment. The dataset comprises 5,000 interaction samples, each containing depth maps, body indexes, and 3D skeletal data captured with three Microsoft Kinect v2 cameras. In addition, the dataset contains the joint angles of a humanoid NAO robot, converted from the human behaviors that the robot is meant to learn.
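As a rough illustration of how such skeletal data might be handled, below is a minimal Python sketch for loading and normalizing a sequence of 3D skeleton frames. The file name, binary layout, and helper names are assumptions for illustration, not the dataset's actual schema; the only grounded detail is that Kinect v2 tracks 25 body joints, each with an (x, y, z) position.

```python
import numpy as np

NUM_JOINTS = 25  # Kinect v2 tracks 25 body joints per skeleton

def load_skeleton_sequence(path):
    """Load a skeleton sequence stored as a flat float32 array.

    Assumed layout (hypothetical): num_frames x 25 joints x (x, y, z).
    """
    raw = np.fromfile(path, dtype=np.float32)
    return raw.reshape(-1, NUM_JOINTS, 3)

def normalize_to_spine(frames, spine_base=0):
    """Express joint positions relative to the spine-base joint so that
    poses are comparable across camera placements. The joint index for
    the spine base is an assumption here."""
    return frames - frames[:, spine_base:spine_base + 1, :]

if __name__ == "__main__":
    # Synthetic 10-frame sequence standing in for a real sample.
    fake = np.random.rand(10, NUM_JOINTS, 3).astype(np.float32)
    fake.tofile("sample_skeleton.bin")
    seq = load_skeleton_sequence("sample_skeleton.bin")
    print(seq.shape)                      # (10, 25, 3)
    print(normalize_to_spine(seq).shape)  # (10, 25, 3)
```

Normalizing to a body-centered origin is a common preprocessing step when skeletons come from multiple cameras, as they do here with three Kinect v2 viewpoints.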

License


  • Unknown
