Quantum neural network architectures that have little-to-no inductive biases are known to face trainability and generalization issues. Inspired by a similar problem, recent breakthroughs in machine learning address this challenge by creating models that encode the symmetries of the learning task. This is materialized through the use of equivariant neural networks, whose action commutes with that of the symmetry. In this work, we import these ideas to the quantum realm by presenting a comprehensive theoretical framework to design equivariant quantum neural networks (EQNNs) for essentially any relevant symmetry group. We develop multiple methods to construct equivariant layers for EQNNs and analyze their advantages and drawbacks. Our methods can find unitary or general equivariant quantum channels efficiently even when the symmetry group is exponentially large or continuous. As a special implementation, we show how standard quantum convolutional neural networks (QCNNs) can be generalized to group-equivariant QCNNs, where both the convolution and pooling layers are equivariant under the symmetry group. We then numerically demonstrate the effectiveness of an SU(2)-equivariant QCNN over a symmetry-agnostic QCNN on a classification task for phases of matter in the bond-alternating Heisenberg model. Our framework can be readily applied to virtually all areas of quantum machine learning. Lastly, we discuss how symmetry-informed models such as EQNNs offer hope for alleviating central challenges such as barren plateaus, poor local minima, and sample complexity.
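One standard route to equivariant layers, which the abstract alludes to, is twirling: averaging a generator over the symmetry group projects it onto the commutant, so the resulting unitary layer commutes with every symmetry operation. The following is a minimal numerical sketch of this idea for a small, hypothetical two-qubit Z2 symmetry (the group, generator, and dimensions are illustrative choices, not the paper's construction):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical symmetry group G = {I, X⊗X} acting on two qubits.
X = np.array([[0, 1], [1, 0]], dtype=complex)
group = [np.eye(4, dtype=complex), np.kron(X, X)]

# An arbitrary Hermitian generator for a variational layer.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (A + A.conj().T) / 2

# Twirl: average over G to project H onto the commutant of the representation.
H_eq = sum(U @ H @ U.conj().T for U in group) / len(group)

# The twirled layer exp(-i θ H_eq) then commutes with every symmetry unitary.
layer = expm(-1j * 0.7 * H_eq)
max_comm = max(np.linalg.norm(layer @ U - U @ layer) for U in group)
```

For continuous groups such as SU(2), the sum over group elements is replaced by an integral over the Haar measure (or, equivalently, by solving the commutation constraints with the group's generators), which is one reason the paper develops several alternative construction methods.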