Cybersickness remains a critical barrier to the widespread adoption of Virtual Reality (VR), particularly in scenarios involving intense or artificial motion cues. A key contributor is excessive optical flow: perceived visual motion that, when unmatched by vestibular input, produces sensory conflict and discomfort. While previous efforts have explored geometric or hardware-based mitigation strategies, such methods often rely on predefined scene structures, manual tuning, or intrusive equipment. In this work, we propose U-MAD, a lightweight, real-time, AI-based solution that suppresses perceptually disruptive optical flow directly at the image level. Unlike prior handcrafted approaches, our method learns to attenuate high-intensity motion patterns in rendered frames without requiring mesh-level editing or scene-specific adaptation. Designed as a plug-and-play module, U-MAD integrates seamlessly into existing VR pipelines and generalizes well to procedurally generated environments. Experiments show that U-MAD consistently reduces average optical flow and enhances temporal stability across diverse scenes. A user study further confirms that reducing visual motion improves perceptual comfort and alleviates cybersickness symptoms. These findings demonstrate that perceptually guided modulation of optical flow provides an effective and scalable approach to creating more user-friendly immersive experiences. The code will be released at https://github.com/XXXXX (upon publication).
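The abstract's evaluation metric (average optical flow) and the general idea of attenuating high-intensity motion can be sketched as follows. This is a minimal illustrative example, not the paper's method: the function names, threshold, and gain are hypothetical, and U-MAD learns its attenuation rather than applying a fixed rule.

```python
import numpy as np

def mean_flow_magnitude(flow):
    """Average per-pixel optical-flow magnitude for a dense
    (H, W, 2) flow field, in pixels per frame."""
    return float(np.linalg.norm(flow, axis=-1).mean())

def attenuate_high_flow(flow, threshold=2.0, gain=0.5):
    """Illustrative (hypothetical) suppression rule: scale down flow
    vectors whose magnitude exceeds `threshold` by `gain`."""
    mag = np.linalg.norm(flow, axis=-1, keepdims=True)
    scale = np.where(mag > threshold, gain, 1.0)
    return flow * scale

# Synthetic flow field standing in for flow estimated between frames.
rng = np.random.default_rng(0)
flow = rng.normal(0.0, 3.0, size=(64, 64, 2))
suppressed = attenuate_high_flow(flow)
print(mean_flow_magnitude(flow), mean_flow_magnitude(suppressed))
```

In practice the flow field would be estimated between consecutive rendered frames (e.g. with a dense optical-flow estimator), and the per-scene average of `mean_flow_magnitude` before and after modulation gives the kind of reduction the experiments report.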