We propose the Scalable Mechanistic Neural Network (S-MNN), an enhanced neural network framework designed for scientific machine learning applications involving long temporal sequences. By reformulating the original Mechanistic Neural Network (MNN) (Pervez et al., 2024), we reduce the computational time and space complexities from, respectively, cubic and quadratic in the sequence length to linear. This significant improvement enables efficient modeling of long-term dynamics without sacrificing accuracy or interpretability. Extensive experiments demonstrate that S-MNN matches the original MNN in precision while consuming substantially fewer computational resources. Consequently, S-MNN can serve as a drop-in replacement for the original MNN, providing a practical and efficient tool for integrating mechanistic bottlenecks into neural network models of complex dynamical systems. Source code is available at https://github.com/IST-DASLab/ScalableMNN.