Incorporating group symmetries into neural networks via equivariance has emerged as a robust approach for reducing the data and compute demands of modern deep learning. While most existing approaches, such as group convolutions and averaging-based methods, focus on compact, finite, or low-dimensional groups with linear actions, this work explores how equivariance can be extended to infinite-dimensional groups. We propose a strategy for inducing diffeomorphism equivariance in pre-trained neural networks via energy-based canonicalisation. Formulating equivariance as an optimisation problem allows us to draw on the rich, well-established toolbox of differentiable image registration methods. Empirical results on segmentation and classification tasks confirm that our approach achieves approximate equivariance and generalises to unseen transformations without relying on extensive data augmentation or retraining.
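The canonicalisation idea above can be illustrated with a toy sketch. Here cyclic shifts of a 1D signal stand in for the (infinite-dimensional) diffeomorphism group, and a simple hand-written energy replaces the registration-based energy; the names `energy`, `canonicalise`, and `invariant_predict` are illustrative assumptions, not the paper's implementation. The point is only the structure: mapping each input to its energy-minimising pose before applying a fixed predictor makes the composite insensitive to the group action.

```python
import numpy as np

def energy(x):
    # Hypothetical energy: small when the signal's mass sits near index 0.
    # In the paper's setting this would be a registration energy optimised
    # over diffeomorphisms; cyclic shifts are a stand-in toy group here.
    return float(np.sum(np.arange(len(x)) * x))

def canonicalise(x):
    # Exhaustively search the (finite) group of cyclic shifts for the
    # energy minimiser and return the signal in its canonical pose.
    shifts = [np.roll(x, -k) for k in range(len(x))]
    return min(shifts, key=energy)

def invariant_predict(f, x):
    # Wrapping a fixed, "pre-trained" predictor f with canonicalisation
    # makes the composite invariant to the group action on the input.
    return f(canonicalise(x))

x = np.array([0.0, 1.0, 3.0, 1.0, 0.0, 0.0])
g_x = np.roll(x, 2)  # a group-transformed copy of the same signal

# Both inputs are mapped to the same canonical pose, so any predictor
# applied afterwards agrees on them.
assert np.allclose(canonicalise(x), canonicalise(g_x))
assert invariant_predict(np.argmax, x) == invariant_predict(np.argmax, g_x)
```

For equivariant (rather than invariant) tasks such as segmentation, one would additionally transport the predictor's output back through the inverse of the recovered transformation; the invariant case shown keeps the sketch minimal.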