Quantum neural networks (QNNs) based on parametrized quantum circuits are promising candidates for machine learning applications, yet many architectures lack clear connections to classical models, potentially limiting their ability to leverage established classical neural network techniques. We examine QNNs built from SWAP test circuits and discuss their equivalence, under amplitude encoding, to classical two-layer feedforward networks with quadratic activations. Evaluation on real-world and synthetic datasets shows that while this architecture learns many practical binary classification tasks, it has fundamental expressivity limitations: networks with polynomial activation functions are not universal approximators, and we show analytically that the architecture cannot learn the parity check function beyond two dimensions, regardless of network size. To address this, we introduce generalized SWAP test circuits in which multiple Fredkin gates share a single ancilla, implementing product layers with polynomial activations of arbitrary even degree. This modification enables successful learning of parity check functions in arbitrary dimensions as well as binary n-spiral tasks, and we provide numerical evidence that the expressivity enhancement extends to alternative encoding schemes such as angle (Z) and ZZ feature maps. We validate the practical feasibility of our proposed architecture by implementing a classically pretrained instance on the IBM Torino quantum processor, achieving 84% classification accuracy on the three-dimensional parity check function despite hardware noise. Our work establishes a framework for analyzing and enhancing QNN expressivity through correspondence with classical architectures, and demonstrates that SWAP test-based QNNs possess broad representational capacity relevant to both classical and potentially quantum learning tasks.
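As a minimal sketch of the correspondence summarized above (notation illustrative, with $x$ the input vector and $w$, $w_j$ trainable parameters), the standard SWAP test yields an ancilla statistic that is quadratic in the encoded overlap, and, assuming $k$ controlled-SWAPs on separate register pairs share one ancilla, the overlaps multiply into an even-degree polynomial activation:
\[
P(0) \;=\; \tfrac{1}{2}\Bigl(1 + \bigl|\langle \phi(w) \,|\, \psi(x) \rangle\bigr|^{2}\Bigr),
\qquad
|\psi(x)\rangle \;=\; \frac{1}{\lVert x \rVert}\sum_{i} x_{i}\,|i\rangle,
\]
\[
P_{\mathrm{gen}}(0) \;=\; \tfrac{1}{2}\Bigl(1 + \prod_{j=1}^{k}\bigl|\langle \phi_{j}(w_{j}) \,|\, \psi_{j}(x) \rangle\bigr|^{2}\Bigr),
\]
so that with amplitude encoding the measured probability is a degree-$2k$ polynomial in the normalized input, reducing for $k=1$ to the quadratic activation of the classical two-layer network.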