Activation functions are fundamental to neural networks: they introduce non-linearity, enabling deep networks to approximate complex relationships in data. Existing efforts to enhance neural network performance have predominantly focused on developing new mathematical functions. However, we find that a well-designed combination of existing activation functions within a neural network can achieve the same effect. In this paper, we introduce the Combined Units activation (CombU), which applies different activation functions to different dimensions within each layer, with the mix varying across layers. We show theoretically that this approach can fit most mathematical expressions accurately. Experiments on four mathematical-expression datasets, compared against six state-of-the-art (SOTA) activation function algorithms, demonstrate that CombU outperforms all SOTA algorithms on 10 of 16 metrics and ranks in the top three on the remaining 6.
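To make the per-dimension mixing concrete, the following is a minimal sketch in PyTorch of one plausible reading of the abstract: the feature dimension of a layer is split into contiguous slices, and each slice is passed through a different existing activation. The class name `CombU`, the particular activations (ReLU, Tanh, ELU), and the even-split allocation are illustrative assumptions, not the paper's reference implementation.

```python
import torch
import torch.nn as nn


class CombU(nn.Module):
    """Applies a different activation function to each slice of the feature dimension.

    NOTE: an illustrative sketch; the activation choices and the even split
    across dimensions are assumptions, not the authors' exact scheme.
    """

    def __init__(self, dim, fns=(nn.ReLU(), nn.Tanh(), nn.ELU())):
        super().__init__()
        self.fns = nn.ModuleList(fns)
        # Divide the feature dimension into roughly equal contiguous slices,
        # one slice per activation function.
        base = dim // len(fns)
        sizes = [base] * len(fns)
        sizes[-1] += dim - base * len(fns)  # last slice absorbs the remainder
        self.sizes = sizes

    def forward(self, x):
        # Split along the feature (last) dimension, activate each slice
        # with its own function, and concatenate the results back together.
        chunks = torch.split(x, self.sizes, dim=-1)
        return torch.cat([f(c) for f, c in zip(self.fns, chunks)], dim=-1)


# Usage: each layer can carry its own mix of activations.
net = nn.Sequential(nn.Linear(8, 16), CombU(16), nn.Linear(16, 4))
out = net(torch.randn(2, 8))
```

Because each layer instantiates its own `CombU` module, different layers can assign different activation functions (or different slice sizes) to their dimensions, which matches the abstract's description of varying the combination across layers.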