Equivariance is a powerful prior for learning physical dynamics, yet exact group equivariance can degrade performance when the symmetries are broken. We propose object-centric world models built with geometric algebra neural networks, which provide a soft geometric inductive bias. We evaluate our models in simulated environments of 2D rigid-body dynamics with static obstacles, training them autoregressively for next-step prediction. On long-horizon rollouts, we show that the soft inductive bias of our models yields better physical fidelity than non-equivariant baselines. The approach complements recent soft-equivariance ideas and aligns with the view that simple, well-chosen priors can yield robust generalization. These results suggest that geometric algebra offers an effective middle ground between hand-crafted physics and unstructured deep networks, delivering sample-efficient dynamics models for multi-object scenes.
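The autoregressive training and rollout scheme mentioned above can be sketched as follows; the `rollout` helper and the toy linear dynamics are illustrative stand-ins, not the paper's actual network:

```python
import numpy as np

def rollout(model, state, horizon):
    """Autoregressively unroll a next-step model: each prediction
    is fed back as the input for the following step."""
    states = [state]
    for _ in range(horizon):
        state = model(state)
        states.append(state)
    return np.stack(states)

# Toy stand-in for a learned dynamics model: a constant-velocity
# update on a (position, velocity) state with time step 0.1.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
traj = rollout(lambda s: A @ s, np.array([0.0, 1.0]), horizon=5)
```

At training time the model is fit on single-step targets; at evaluation time long-horizon trajectories are produced by feeding predictions back in, which is where accumulated error makes the inductive bias matter.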