Ordinary differential equations (ODEs) are widely used to describe dynamical systems in science, but identifying parameters that explain experimental measurements is challenging. In particular, although ODEs are differentiable and would allow for gradient-based parameter optimization, their nonlinear dynamics often produce loss landscapes with many local minima and extreme sensitivity to initial conditions. We therefore propose diffusion tempering, a novel regularization technique for probabilistic numerical methods that improves the convergence of gradient-based parameter optimization in ODEs. By iteratively reducing a noise parameter of the probabilistic integrator, the proposed method converges more reliably to the true parameters. We demonstrate that our method is effective for dynamical systems of different complexity, and show that it obtains reliable parameter estimates for a Hodgkin-Huxley model with a practically relevant number of parameters.
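The core idea of the abstract — warm-starting gradient descent along a schedule of decreasing noise so that early, smoothed stages guide the optimizer into the right basin — can be loosely illustrated with a toy homotopy sketch. This is an analogy on a synthetic objective, not the paper's probabilistic integrator: the objective, the role of `sigma`, and all function names here are invented for illustration.

```python
import math

def loss(theta, sigma):
    """Toy nonconvex objective standing in for an ODE parameter loss.

    The quadratic term has its global basin at theta = 2; the sine term
    creates many local minima. `sigma` mimics the role of the integrator's
    noise parameter in spirit only: large sigma damps the oscillations,
    leaving a smooth, almost convex landscape.
    """
    amp = 1.0 / (1.0 + sigma)  # oscillation amplitude; -> 0 as sigma grows
    return (theta - 2.0) ** 2 + amp * 2.0 * math.sin(10.0 * theta)

def grad(theta, sigma):
    amp = 1.0 / (1.0 + sigma)
    return 2.0 * (theta - 2.0) + amp * 20.0 * math.cos(10.0 * theta)

def tempered_descent(theta0, sigmas, lr=0.005, steps=500):
    """Gradient descent warm-started across a decreasing noise schedule."""
    theta = theta0
    for sigma in sigmas:          # anneal sigma from large to small
        for _ in range(steps):    # each stage starts at the previous optimum
            theta -= lr * grad(theta, sigma)
    return theta

schedule = [1000.0, 100.0, 10.0, 3.0, 1.0, 0.3, 0.1, 0.03, 0.0]
theta_tempered = tempered_descent(-1.0, schedule)
# Baseline: the same budget of plain gradient descent on the raw objective.
theta_plain = tempered_descent(-1.0, [0.0], steps=500 * len(schedule))
```

With the schedule, descent first locates the basin of the smoothed objective and then tracks it as the oscillations return; plain descent from the same initialization typically gets trapped in a nearby local minimum far from the global basin.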