We present DexCanvas, a large-scale hybrid real-synthetic dataset of human manipulation containing 7,000 hours of dexterous hand-object interactions seeded from 70 hours of real human demonstrations, organized into 21 fundamental manipulation types based on the Cutkosky taxonomy. Each entry combines synchronized multi-view RGB-D video, high-precision motion capture with MANO hand parameters, and per-frame contact points with physically consistent force profiles. Our real-to-sim pipeline uses reinforcement learning to train policies that control an actuated MANO hand in physics simulation, reproducing human demonstrations while recovering the underlying contact forces that generate the observed object motion. DexCanvas is the first manipulation dataset to combine large-scale real demonstrations, systematic skill coverage grounded in an established taxonomy, and physics-validated contact annotations. The dataset can facilitate research on robotic manipulation learning, contact-rich control, and skill transfer across hand morphologies.