Neural Ordinary Differential Equations (ODEs) represent a significant advancement at the intersection of machine learning and dynamical systems, offering a continuous-time analog to discrete neural networks. Despite their promise, deploying neural ODEs in practical applications often runs into the challenge of stiffness, a condition in which rapid variations in some components of the solution demand prohibitively small time steps for explicit solvers. This work addresses the stiffness issue that arises when employing neural ODEs for model order reduction by introducing a suitable reparametrization in time. The map is data-driven, induced by the adaptive time-stepping of an implicit solver applied to a reference solution. We show that the map produces a nonstiff system that can be solved cheaply with an explicit time-integration scheme. The original, stiff time dynamics are recovered by means of a map, learnt by a neural network, that connects the state space to the time reparametrization. We validate our method through extensive experiments, demonstrating improved efficiency of neural ODE inference while maintaining robustness and accuracy. The neural network model also exhibits good generalization for times beyond the training data.
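As a sketch of the core idea described above (the notation $\varphi$, $\tau$, $z$ is ours, not necessarily the paper's): a monotone change of time variable rescales the Jacobian of the right-hand side, so a map whose derivative tracks the implicit solver's accepted step sizes can equalize the system's time scales.

```latex
% Hedged sketch; symbols are illustrative, not taken from the paper.
% Original stiff system:
\dot{y}(t) = f\big(y(t)\big), \qquad y(0) = y_0 .
% Introduce a time reparametrization t = \varphi(\tau) with \varphi
% increasing, and set z(\tau) := y(\varphi(\tau)). By the chain rule,
\frac{\mathrm{d}z}{\mathrm{d}\tau} = \varphi'(\tau)\, f\big(z(\tau)\big).
% The Jacobian of the new right-hand side is \varphi'(\tau)\,\partial_y f,
% so choosing \varphi' small where the dynamics are fast -- e.g.
% proportional to the step sizes an adaptive implicit solver accepts on a
% reference solution -- damps the fast modes and yields a system in \tau
% that an explicit scheme can integrate with moderate, uniform steps.
```

In this reading, the neural network mentioned in the abstract learns the state-dependent link to the reparametrization, so that the original time dynamics $y(t)$ can be recovered from the nonstiff trajectory $z(\tau)$.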