This paper demonstrates that a single-layer neural network using the Parametric Rectified Linear Unit (PReLU) activation can solve the XOR problem, a simple fact that has so far been overlooked. We compare this solution to the multi-layer perceptron (MLP) and to the Growing Cosine Unit (GCU) activation function, and explain why PReLU enables this capability. Our results show that the single-layer PReLU network achieves a 100\% success rate over a wider range of learning rates while using only three learnable parameters.
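To make the claim concrete, the following is a minimal sketch (not the authors' code) of a single PReLU neuron with exactly three learnable parameters: two input weights and the PReLU negative-side slope $a$, with no bias term. The learning rate, epoch count, and initialization below are illustrative assumptions. A known closed-form solution of this shape is $w = (1, -1)$, $a = -1$, since PReLU$(x_1 - x_2)$ with $a = -1$ equals $|x_1 - x_2|$, which is exactly XOR on $\{0,1\}^2$.

```python
# Minimal sketch: one PReLU neuron, three learnable parameters (w1, w2, a),
# trained on XOR with plain gradient descent. Hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])

w = rng.normal(scale=0.5, size=2)  # two input weights (no bias term)
a = 0.25                           # learnable PReLU slope for z <= 0

lr = 0.1
for epoch in range(2000):
    z = X @ w                          # pre-activation
    out = np.where(z > 0, z, a * z)    # PReLU(z) = z if z > 0 else a*z
    err = out - y                      # MSE gradient factor (the 2 is absorbed into lr)

    # Gradients: d(out)/dz = 1 for z > 0, a otherwise; d(out)/da = z on z <= 0
    dz = np.where(z > 0, 1.0, a)
    grad_w = X.T @ (err * dz) / len(y)
    grad_a = np.sum(err * np.where(z > 0, 0.0, z)) / len(y)

    w -= lr * grad_w
    a -= lr * grad_a

print("w =", w.round(3), "a =", round(a, 3))
z = X @ w
print("predictions:", np.where(z > 0, z, a * z).round(2))
```

Whether a given run recovers the $w = (1, -1)$, $a = -1$ solution (or a scaled equivalent) depends on initialization and learning rate; the paper's reported 100\% success rate refers to its own experimental setup, not to this sketch.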