1 code implementation • 23 Feb 2024 • Hanxiao Jiang, Binghao Huang, Ruihai Wu, Zhuoran Li, Shubham Garg, Hooshang Nayyeri, Shenlong Wang, Yunzhu Li
Robots need to explore their surroundings to adapt to and tackle tasks in unknown environments.
no code implementations • 4 Dec 2023 • Ying Yuan, Haichuan Che, Yuzhe Qin, Binghao Huang, Zhao-Heng Yin, Kang-Won Lee, Yi Wu, Soo-Chul Lim, Xiaolong Wang
In this paper, we introduce a system that leverages visual and tactile sensory inputs to enable dexterous in-hand manipulation.
no code implementations • 11 Sep 2023 • Binghao Huang, Yuanpei Chen, Tianyu Wang, Yuzhe Qin, Yaodong Yang, Nikolay Atanasov, Xiaolong Wang
Humans throw and catch objects all the time.
no code implementations • 10 Jul 2023 • Yuzhe Qin, Wei Yang, Binghao Huang, Karl Van Wyk, Hao Su, Xiaolong Wang, Yu-Wei Chao, Dieter Fox
In real-world experiments, AnyTeleop outperforms a previous system designed for specific robot hardware, achieving a higher success rate on the same robot.
no code implementations • 20 Mar 2023 • Zhao-Heng Yin, Binghao Huang, Yuzhe Qin, Qifeng Chen, Xiaolong Wang
Relying on touch-only sensing, we can directly deploy the policy on a real robot hand and rotate novel objects not seen during training.
no code implementations • 17 Nov 2022 • Yuzhe Qin, Binghao Huang, Zhao-Heng Yin, Hao Su, Xiaolong Wang
We empirically evaluate our method using an Allegro Hand to grasp novel objects in both simulation and the real world.
1 code implementation • 11 Jul 2022 • Jianglong Ye, Jiashun Wang, Binghao Huang, Yuzhe Qin, Xiaolong Wang
We first convert large-scale human-object interaction trajectories into robot demonstrations via motion retargeting, and then use these demonstrations to train CGF.
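The retargeting step mentioned in this last entry can be illustrated with a toy sketch: solve for robot joint angles that place a fingertip at a (scaled) human hand keypoint. Everything below is an illustrative assumption, not the paper's actual pipeline: a planar 2-link finger, made-up link lengths, and hypothetical `fk`/`retarget` helpers solved with closed-form inverse kinematics.

```python
import math

# Hypothetical link lengths (meters) for a planar 2-link robot finger.
L1, L2 = 0.05, 0.04

def fk(q1, q2):
    """Forward kinematics: joint angles -> fingertip (x, y)."""
    x = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    y = L1 * math.sin(q1) + L2 * math.sin(q1 + q2)
    return x, y

def retarget(x, y):
    """Closed-form 2-link IK: fingertip target -> joint angles (q1, q2)."""
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    q2 = math.acos(max(-1.0, min(1.0, c2)))  # clamp for numerical safety
    q1 = math.atan2(y, x) - math.atan2(L2 * math.sin(q2),
                                       L1 + L2 * math.cos(q2))
    return q1, q2

# A human fingertip keypoint, already scaled into the robot's workspace.
q1, q2 = retarget(0.06, 0.03)
x, y = fk(q1, q2)
print(round(x, 3), round(y, 3))  # -> 0.06 0.03
```

Real systems retarget many keypoints at once on a multi-finger hand, typically via optimization rather than a closed form, but the principle is the same: find joint angles whose forward kinematics reproduce the human pose.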