Steerable convolutional neural networks (SCNNs) enhance task performance by modelling geometric symmetries through equivariance constraints on weights. Yet, unknown or varying symmetries can lead to overconstrained weights and decreased performance. To address this, we introduce a probabilistic method to learn the degree of equivariance in SCNNs. We parameterise the degree of equivariance as a likelihood distribution over the transformation group using Fourier coefficients, offering the option to model either layer-wise or shared equivariance. These likelihood distributions are regularised to ensure an interpretable degree of equivariance across the network. Advantages include applicability to many types of equivariant networks through the flexible framework of SCNNs, and the ability to learn equivariance with respect to any subgroup of any compact group without requiring additional layers. Our experiments reveal competitive performance on datasets with mixed symmetries, with learnt likelihood distributions that are representative of the underlying degree of equivariance.
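To make the parameterisation concrete, the following is a minimal sketch of how a likelihood over a transformation group can be built from Fourier coefficients, here for a discretised SO(2) of planar rotations. The function name and the exp-normalisation used to obtain a valid distribution are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def fourier_likelihood(coeffs_cos, coeffs_sin, n_angles=360):
    """Illustrative sketch: a likelihood over rotation angles in SO(2),
    parameterised by truncated Fourier coefficients (hypothetical form)."""
    theta = np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False)
    # Truncated Fourier series f(theta) = sum_k a_k cos(k*theta) + b_k sin(k*theta)
    f = np.zeros_like(theta)
    for k, (a, b) in enumerate(zip(coeffs_cos, coeffs_sin), start=1):
        f += a * np.cos(k * theta) + b * np.sin(k * theta)
    # Exponentiate and normalise so the values form a probability distribution
    p = np.exp(f - f.max())
    p /= p.sum()
    return theta, p

# All-zero coefficients yield a uniform distribution over rotations,
# i.e. full rotational equivariance in this sketch.
theta, p = fourier_likelihood([0.0], [0.0])
```

In this picture, learning the Fourier coefficients amounts to learning how sharply the likelihood concentrates on a subgroup (partial equivariance) versus staying uniform (full equivariance).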