Touch data from mobile devices are collected at scale but reveal little about the interactions that produce them. While biomechanical simulations can illuminate motor control processes, they have not yet been developed for touch interactions. To close this gap, we propose a novel computational problem: synthesizing plausible motion directly from logs. Our key insight is a reinforcement learning-driven musculoskeletal forward simulation that generates biomechanically plausible motion sequences consistent with events recorded in touch logs. We achieve this by integrating a software emulator into a physics simulator, allowing biomechanical models to manipulate real applications in real time. Log2Motion produces rich syntheses of user movements from touch logs, including estimates of motion, speed, accuracy, and effort. We assess the plausibility of generated movements by comparing them against human data from a motion capture study and against prior findings, and demonstrate Log2Motion on a large-scale dataset. Biomechanical motion synthesis provides a new way to understand log data, illuminating the ergonomics and motor control underlying touch interactions.