Humans perform a wide variety of interactive motions, among which duet dance is one of the most challenging. However, existing human motion generative models are still unable to produce high-quality interactive motions, especially for duet dance. This stems, on the one hand, from the lack of large-scale, high-quality datasets and, on the other, from incomplete representations of interactive motion and the absence of fine-grained interaction optimization. To address these challenges, we propose InterDance, a large-scale duet dance dataset that significantly improves motion quality, data scale, and the diversity of dance genres. Built upon this dataset, we propose a new motion representation that accurately and comprehensively describes interactive motion. We further introduce a diffusion-based framework with an interaction refinement guidance strategy that progressively optimizes the realism of interactions. Extensive experiments demonstrate the effectiveness of our dataset and algorithm.