Dex1B: Learning with 1B Demonstrations for Dexterous Manipulation

20 June 2025
Jianglong Ye
Keyi Wang
Chengjing Yuan
Ruihan Yang
Yiquan Li
Jiyue Zhu
Yuzhe Qin
Xueyan Zou
Xiaolong Wang
Main: 8 pages · 12 figures · 6 tables · Bibliography: 3 pages · Appendix: 2 pages
Abstract

Generating large-scale demonstrations for dexterous hand manipulation remains challenging, and several approaches have been proposed in recent years to address this. Among them, generative models have emerged as a promising paradigm, enabling the efficient creation of diverse and physically plausible demonstrations. In this paper, we introduce Dex1B, a large-scale, diverse, and high-quality demonstration dataset produced with generative models. The dataset contains one billion demonstrations for two fundamental tasks: grasping and articulation. To construct it, we propose a generative model that integrates geometric constraints to improve feasibility and applies additional conditions to enhance diversity. We validate the model on both established and newly introduced simulation benchmarks, where it significantly outperforms prior state-of-the-art methods. Furthermore, we demonstrate its effectiveness and robustness through real-world robot experiments. Our project page is at this https URL.
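The pipeline the abstract alludes to — generate candidate demonstrations, enforce geometric feasibility constraints, and promote diversity — can be sketched in a toy form. Everything below (the joint-limit ranges, the spherical-object penetration proxy, the greedy farthest-point diversity heuristic) is a hypothetical illustration for intuition, not the paper's actual model or hand kinematics:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_candidate_grasps(n, dof=16):
    """Hypothetical generator: sample hand joint configurations
    uniformly within toy joint limits [-1, 1] per DoF."""
    return rng.uniform(-1.0, 1.0, size=(n, dof))

def feasible(q, obj_radius=0.5):
    """Hypothetical geometric constraint: reject configurations whose
    'fingertip' proxy (norm of the first 4 joints, a stand-in for
    forward kinematics) penetrates a spherical object stand-in."""
    return np.linalg.norm(q[:4]) >= obj_radius

def diversify(cands, k):
    """Greedy farthest-point selection: keep a subset of k samples
    that are mutually far apart in configuration space."""
    chosen = [0]
    dist = np.linalg.norm(cands - cands[0], axis=1)
    while len(chosen) < k:
        i = int(np.argmax(dist))
        chosen.append(i)
        dist = np.minimum(dist, np.linalg.norm(cands - cands[i], axis=1))
    return cands[chosen]

# Generate -> filter by geometric feasibility -> keep a diverse subset.
candidates = sample_candidate_grasps(1000)
kept = np.array([q for q in candidates if feasible(q)])
demos = diversify(kept, k=32)
print(demos.shape)  # (32, 16)
```

At dataset scale, each stage would be replaced by a learned conditional generative model, a physics-based feasibility check, and explicit diversity conditioning, but the generate/filter/diversify structure is the same.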

@article{ye2025_2506.17198,
  title={Dex1B: Learning with 1B Demonstrations for Dexterous Manipulation},
  author={Jianglong Ye and Keyi Wang and Chengjing Yuan and Ruihan Yang and Yiquan Li and Jiyue Zhu and Yuzhe Qin and Xueyan Zou and Xiaolong Wang},
  journal={arXiv preprint arXiv:2506.17198},
  year={2025}
}