DREAM-dataset (Deep Robot-to-camera Extrinsics for Articulated Manipulators)

Introduced by Lee et al. in Camera-to-Robot Pose Estimation from a Single Image

The DREAM dataset was introduced in the paper "Camera-to-Robot Pose Estimation from a Single Image" (ICRA 2020). It consists of synthetic images (both with and without domain randomization) of three robot manipulators (Franka Emika’s Panda, Kuka’s LBR iiwa 7 R800, and Rethink Robotics’ Baxter), as well as real-world images of Franka Emika’s Panda captured with several RGBD cameras (XBox 360 Kinect (XK), RealSense (RS), and Azure Kinect (AK)). Each instance in the dataset contains an RGB image, 3D/2D keypoint coordinates, the global camera-to-robot transformation, and the robot’s joint state configuration (covering both revolute and prismatic joints). Tasks such as estimating the robot (camera) pose from a single RGB image and camera-to-robot calibration can be conducted and evaluated on this dataset.
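As a minimal sketch of how one annotated frame from such a dataset could be consumed, the snippet below projects 3D keypoints given in the robot base frame into the image using the camera-from-robot transform and the camera intrinsics. The file name and JSON field names (`keypoints_3d`, `camera_to_robot`, `camera_intrinsics`) are illustrative assumptions, not the dataset's actual schema.

```python
import json
import numpy as np

def project_keypoints(kp_xyz_robot, T_cam_from_robot, K):
    """Project Nx3 keypoints in the robot base frame into pixel coordinates,
    given a 4x4 camera-from-robot transform and a 3x3 intrinsics matrix."""
    kp_h = np.hstack([kp_xyz_robot, np.ones((len(kp_xyz_robot), 1))])  # Nx4 homogeneous
    kp_cam = (T_cam_from_robot @ kp_h.T).T[:, :3]                      # Nx3 in camera frame
    uv = (K @ kp_cam.T).T                                              # pinhole projection
    return uv[:, :2] / uv[:, 2:3]                                      # divide by depth

# Hypothetical per-frame annotation file; field names are assumptions for illustration.
with open("frame_000000.json") as f:
    ann = json.load(f)

kp_3d = np.array(ann["keypoints_3d"])        # Nx3, robot base frame
T = np.array(ann["camera_to_robot"])         # 4x4 extrinsics (assumed camera-from-robot)
K = np.array(ann["camera_intrinsics"])       # 3x3 intrinsics
kp_2d = project_keypoints(kp_3d, T, K)
print(kp_2d)                                 # pixel coordinates of the keypoints
```

Comparing these projected keypoints against detected 2D keypoints is the basis of PnP-style camera-to-robot pose estimation, which is the task this dataset targets.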
