We present DexCanvas, a large-scale hybrid real-synthetic human manipulation dataset containing 7,000 hours of dexterous hand-object interactions seeded from 70 hours of real human demonstrations, organized across 21 fundamental manipulation types based on the Cutkosky taxonomy. Each entry combines synchronized multi-view RGB-D video, high-precision motion capture with MANO hand parameters, and per-frame contact points with physically consistent force profiles. Our real-to-sim pipeline uses reinforcement learning to train policies that control an actuated MANO hand in physics simulation, reproducing human demonstrations while recovering the underlying contact forces that generate the observed object motion. DexCanvas is the first manipulation dataset to combine large-scale real demonstrations, systematic skill coverage grounded in an established taxonomy, and physics-validated contact annotations. The dataset can facilitate research in robotic manipulation learning, contact-rich control, and skill transfer across different hand morphologies.