Equivariant neural networks (ENNs) have been shown to be extremely effective in applications involving underlying symmetries. By construction, an ENN cannot produce a lower-symmetry output from a higher-symmetry input. However, symmetry breaking occurs in many physical systems, and we may obtain a less symmetric stable state from an initial highly symmetric one. Hence, it is imperative that we understand how to systematically break symmetry in ENNs. In this work, we propose a novel, fully equivariant symmetry breaking framework that is the first to fully address spontaneous symmetry breaking. We emphasize that our approach is general and applicable to equivariance under any group. To achieve this, we introduce the idea of symmetry breaking sets (SBS). Rather than redesigning existing networks, we design sets of symmetry breaking objects which we feed into our network based on the symmetries of our inputs and outputs. We show there is a natural way to define equivariance on these sets, which gives an additional constraint. Minimizing the size of these sets equates to data efficiency. We prove that minimizing these sets translates to a well-studied group theory problem, and we tabulate solutions to this problem for the point groups. Finally, we provide some examples of symmetry breaking to demonstrate how our approach works in practice. The code for these examples is available at \url{https://github.com/atomicarchitects/equivariant-SBS}.
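The obstruction described above, and the idea of feeding in extra symmetry breaking objects, can be illustrated with a minimal toy sketch (not the paper's actual framework; the maps `f`, `f2` and the object `b` are hypothetical stand-ins) using the cyclic group $C_4$ of 90-degree rotations in 2D:

```python
import numpy as np

# Rotation matrices for the cyclic group C4 (90-degree rotations in 2D).
def rot(k):
    theta = k * np.pi / 2
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

C4 = [rot(k) for k in range(4)]

# A toy equivariant map (the identity is trivially equivariant).
f = lambda x: x

# A C4-symmetric input: the zero vector is fixed by every rotation.
x = np.zeros(2)

# Equivariance f(g x) = g f(x), together with g x = x, forces
# f(x) = g f(x) for every g: the output must be just as symmetric
# as the input, so an equivariant map alone cannot break symmetry.
for g in C4:
    assert np.allclose(f(g @ x), g @ f(x))

# Symmetry breaking via an extra input (illustrative): feed a
# lower-symmetry object b alongside x. The group now acts jointly,
# (x, b) -> (g x, g b), so a two-argument equivariant map can
# produce a non-symmetric output while remaining equivariant.
b = np.array([1.0, 0.0])     # fixed by no nontrivial rotation in C4
f2 = lambda x, b: x + b      # equivariant: f2(g x, g b) = g f2(x, b)

for g in C4:
    assert np.allclose(f2(g @ x, g @ b), g @ f2(x, b))

print(f2(x, b))  # a non-symmetric output, e.g. [1. 0.]
```

In the paper's terms, the set of all rotated copies $\{g b\}$ plays the role of a symmetry breaking set: equivariance is defined over the whole set, and a smaller set means fewer extra inputs to learn over, i.e. better data efficiency.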