Recently, it was shown that group equivariance emerges in ensembles of neural networks as a result of full data augmentation in the limit of infinitely wide networks (the neural tangent kernel limit). In this paper, we extend this result significantly. We prove that the emergence does not depend on the neural tangent kernel limit at all. We also consider stochastic settings and, furthermore, general architectures. For the latter, we provide a simple sufficient condition on the relation between the architecture and the action of the group for our results to hold. We validate our findings through simple numerical experiments.
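As a rough illustration of the claimed phenomenon (not the paper's own experiment), the following minimal sketch trains an ensemble of independently initialized MLPs on fully augmented data and compares the equivariance error of a single member against that of the ensemble mean. Here the group is $\mathbb{Z}_2$ acting by sign flip on inputs and outputs, so the target symmetry is $f(-x) = -f(x)$; all architecture choices, hyperparameters, and names are illustrative assumptions.

\begin{verbatim}
# Sketch: ensemble mean of nets trained on fully augmented
# data is (approximately) Z2-equivariant: f(-x) = -f(x).
# All hyperparameters below are illustrative, not the paper's.
import torch
import torch.nn as nn

torch.manual_seed(0)
G = [1.0, -1.0]  # the two elements of Z2, acting by sign flip

# Synthetic regression data; the target itself is odd (equivariant).
X = torch.randn(256, 8)
y = (X ** 3).sum(dim=1, keepdim=True)

# Full augmentation: include every group-transformed copy of each sample.
X_aug = torch.cat([g * X for g in G])
y_aug = torch.cat([g * y for g in G])

def train_one_net():
    net = nn.Sequential(nn.Linear(8, 64), nn.Tanh(), nn.Linear(64, 1))
    opt = torch.optim.Adam(net.parameters(), lr=1e-2)
    for _ in range(300):
        opt.zero_grad()
        loss = ((net(X_aug) - y_aug) ** 2).mean()
        loss.backward()
        opt.step()
    return net

# Ensemble of independently initialized networks.
ensemble = [train_one_net() for _ in range(20)]

def ensemble_mean(x):
    return torch.stack([net(x) for net in ensemble]).mean(dim=0)

x_test = torch.randn(64, 8)
with torch.no_grad():
    # Equivariance error: |f(-x) + f(x)| should vanish for an odd f.
    single_err = (ensemble[0](-x_test) + ensemble[0](x_test)).abs().mean()
    mean_err = (ensemble_mean(-x_test) + ensemble_mean(x_test)).abs().mean()
print(f"single-net equivariance error:    {single_err.item():.4f}")
print(f"ensemble-mean equivariance error: {mean_err.item():.4f}")
\end{verbatim}

Under these assumptions, the ensemble-mean error should be noticeably smaller than that of any single member, since averaging over initializations cancels the residual asymmetry of individual networks.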