Recently, Neural Fields (NeFs) have emerged as a powerful modelling paradigm for representing continuous signals. In a conditional NeF, a field is represented by a latent variable that conditions the NeF, whose parametrisation is otherwise shared over an entire dataset. We propose Equivariant Neural Fields (ENFs) based on cross-attention transformers, in which NeFs are conditioned on a geometric conditioning variable, a latent point cloud, that enables equivariant decoding from latent to field. Our equivariant approach induces a steerability property by which both field and latent are grounded in geometry and subject to transformation laws: if the field transforms, the latent representation transforms accordingly, and vice versa. Crucially, the equivariance relation ensures that the latent is capable of (1) representing geometric patterns faithfully, allowing for geometric reasoning in latent space, and (2) weight-sharing over spatially similar patterns, allowing for efficient learning over datasets of fields. We validate these properties through classification experiments and by verifying the capability of fitting entire datasets, in comparison with other non-equivariant NeF approaches. We further validate the potential of ENFs by demonstrating unique local field-editing properties.
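The steerability property described above can be illustrated with a toy decoder. This is a minimal sketch, not the paper's actual cross-attention architecture: it assumes a hypothetical distance-based attention kernel, so that attention weights depend only on roto-translation-invariant quantities between the query coordinate and the latent point cloud. Transforming the query and the latent poses jointly then leaves the decoded field value unchanged, i.e. f_{g·z}(g·x) = f_z(x).

```python
import numpy as np

def enf_decode(x, points, contexts, temp=1.0):
    """Toy equivariant decoder (illustrative, not the paper's model).

    Attention weights depend only on the SE(2)-invariant distances
    between the query coordinate x and the latent points p_i, so the
    decoded value is unchanged when x and the point cloud are
    transformed together.
    """
    d = np.linalg.norm(points - x, axis=-1)  # invariant distances, shape (N,)
    w = np.exp(-d / temp)
    w = w / w.sum()                          # softmax-style attention weights
    return w @ contexts                      # attended context, shape (C,)

rng = np.random.default_rng(0)
points = rng.normal(size=(8, 2))    # latent poses p_i
contexts = rng.normal(size=(8, 4))  # latent context vectors c_i
x = np.array([0.3, -0.7])           # query coordinate

# Apply the same rotation g to the query and to the latent poses.
theta = 0.9
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
out = enf_decode(x, points, contexts)
out_g = enf_decode(R @ x, points @ R.T, contexts)

# Steerability: transforming field coordinates and latent together
# leaves the output invariant, f_{g·z}(g·x) = f_z(x).
assert np.allclose(out, out_g)
```

The check passes because every quantity entering the attention weights is invariant under the joint transformation; the same reasoning underlies the weight-sharing over spatially similar patterns claimed in the abstract.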