The multilayer perceptron (MLP) has permeated disciplines ranging from bioinformatics to financial analytics, where it has become an indispensable tool in contemporary scientific research. However, the MLP has two notable drawbacks: (1) its activation functions are few in kind and fixed in form, which limits the network's representational ability and often forces complex networks onto simple problems; (2) its structure is not adaptive, so the network easily becomes redundant or insufficient for the task at hand. In this work, we propose a novel neural network paradigm, X-Net, as a promising replacement for the MLP. During training, X-Net dynamically learns an activation function for each neuron individually, guided by derivative information, to improve the network's representational ability on a specific task. At the same time, X-Net can precisely adjust its structure at the neuron level to accommodate tasks of varying complexity and reduce computational cost. We show that X-Net outperforms MLPs in representational capability: on regression and classification tasks, X-Net achieves comparable or better performance than an MLP with far fewer parameters. Specifically, X-Net uses on average only 3% as many parameters as an MLP, and as few as 1.1% on some tasks. We also demonstrate X-Net's ability to perform scientific discovery on data from disciplines such as energy, environment, and aerospace, where it helps scientists uncover new mathematical or physical laws.
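To make the idea of per-neuron learnable activations concrete, the following is a minimal sketch, not the authors' implementation: a single neuron whose output is a learnable mixture of candidate activation functions, with the mixture weights updated by gradient descent (i.e., using derivative information). The candidate library and the mixture parameterization are assumptions for illustration only.

```python
import numpy as np

# Hypothetical candidate activation library (the actual X-Net set is
# not specified in the abstract).
candidates = [np.sin, np.tanh, lambda z: z, lambda z: z**2]

def forward(x, w):
    # Neuron output: a learnable linear mixture of candidate activations.
    return sum(wk * f(x) for wk, f in zip(w, candidates))

def fit(x, y, lr=0.02, steps=10000):
    # Learn the mixture weights by plain gradient descent on squared error.
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=len(candidates))
    for _ in range(steps):
        err = forward(x, w) - y
        # d/dw_k mean((forward - y)^2) = mean(2 * err * f_k(x))
        grad = np.array([np.mean(2.0 * err * f(x)) for f in candidates])
        w -= lr * grad
    return w

x = np.linspace(-3, 3, 200)
y = np.sin(x)          # target law: y = sin(x)
w = fit(x, y)
# The weight on the sin candidate should dominate the learned mixture.
```

Because the neuron can settle on sin itself rather than approximating it with a fixed activation, the fitted mixture is also directly readable as a symbolic law, which is the property the scientific-discovery experiments exploit.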