Recently, it was proved that group equivariance emerges in ensembles of neural networks as a result of full augmentation in the limit of infinitely wide neural networks (the neural tangent kernel limit). In this paper, we significantly extend this result: we prove that the emergence does not depend on the neural tangent kernel limit at all. We also cover stochastic settings and, furthermore, general architectures. For the latter, we provide a simple sufficient condition on the relation between the architecture and the action of the group for our results to hold. We validate our findings through simple numerical experiments.
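The symmetrization mechanism behind such emergence results can be illustrated with a minimal sketch (the architecture and names below are illustrative, not the paper's): averaging a generic, non-equivariant function over the action of a finite group yields an exactly equivariant function, here for the rotation group C4 acting on R^2.

```python
import numpy as np

def rot(k):
    """Rotation by k * 90 degrees as a 2x2 matrix (the C4 action on R^2)."""
    th = k * np.pi / 2
    return np.array([[np.cos(th), -np.sin(th)],
                     [np.sin(th),  np.cos(th)]])

rng = np.random.default_rng(0)
# A generic two-layer network R^2 -> R^2 with random (untrained) weights;
# it has no built-in symmetry.
W1 = rng.standard_normal((8, 2))
W2 = rng.standard_normal((2, 8))

def f(x):
    return W2 @ np.tanh(W1 @ x)

def f_sym(x):
    """Group average: f_sym(x) = (1/|G|) * sum_g g^{-1} f(g x)."""
    return np.mean([rot(-k) @ f(rot(k) @ x) for k in range(4)], axis=0)

x = rng.standard_normal(2)
g = rot(1)
# Equivariance of the symmetrized function: f_sym(g x) = g f_sym(x),
# which holds exactly (up to floating-point error) by a change of
# summation variable in the group average.
assert np.allclose(f_sym(g @ x), g @ f_sym(x))
```

The ensemble results the abstract refers to concern a subtler statement, namely that the *average over an ensemble* of augmentation-trained networks becomes equivariant; the group-averaging identity above is only the elementary algebraic ingredient.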