We present the HOH (Human-Object-Human) Handover Dataset, a large-scale dataset featuring 136 objects, to accelerate data-driven research on handover studies, human-robot handover implementation, and artificial intelligence (AI) for handover parameter estimation from 2D and 3D data of two-person interactions. HOH contains multi-view RGB and depth data, skeletons, fused point clouds, grasp type and handedness labels, 2D and 3D segmentations of the object, giver hand, and receiver hand, giver and receiver comfort ratings, and paired object metadata with aligned 3D models, covering 2,720 handover interactions spanning 136 objects and 20 giver-receiver pairs (40 pairs when role reversal is counted), drawn from 40 participants. We also present experimental results of neural networks trained on HOH to perform grasp, orientation, and trajectory prediction. As the only fully markerless handover capture dataset, HOH represents natural human-human handover interactions, overcoming the limitations of marker-based datasets, which require special suits for body tracking and lack high-resolution hand tracking. To date, HOH is the largest handover dataset in number of objects, participants, pairs with role reversal accounted for, and total interactions captured.