Mathematical formulas serve as the language through which humans describe nature, encapsulating the laws that govern natural phenomena. Formulating these laws concisely is a central objective of scientific research and an important challenge for artificial intelligence (AI). While traditional artificial neural networks (multi-layer perceptrons, MLPs) excel at fitting data, they often yield uninterpretable black-box results that obscure the relationship between the input variables x and the predicted values y. Moreover, the fixed architecture of an MLP often introduces redundancy in both network structure and parameters. To address these issues, we propose MetaSymNet, a novel neural network that dynamically adjusts its structure in real time, allowing both expansion and contraction. This adaptive network employs the PANGU meta-function as its activation function: a function that can evolve into various basic functions during training to compose mathematical formulas tailored to the data. The trained network is then converted into a concise, interpretable mathematical expression. To evaluate MetaSymNet's performance, we compare it with four state-of-the-art symbolic regression algorithms on more than 10 public datasets comprising 222 formulas. Our experiments show that our algorithm consistently outperforms the others, both with and without noise. Furthermore, we compare MetaSymNet with MLPs and SVMs on fitting ability and extrapolation capability, two essential properties of machine learning algorithms; the results show that our algorithm excels in both. Finally, we compare the network structural complexity of MetaSymNet with that of an MLP subjected to iterative pruning: at the same goodness of fit, MetaSymNet's network structure is substantially simpler.