We propose a method to enhance the stability of a neural ordinary differential equation (neural ODE) by controlling the Lipschitz constant $C$ of its flow. Since $C$ is known to depend on the logarithmic norm of the Jacobian matrix associated with the neural ODE, we tune this constant by perturbing the Jacobian matrix with a perturbation that is as small as possible in the Frobenius norm. To this end, we formulate an optimization problem and propose a nested two-level algorithm: for a given perturbation size, the inner level computes the optimal perturbation of fixed Frobenius norm, while the outer level tunes the perturbation amplitude. We embed the proposed algorithm in the training of the neural ODE to improve its stability. Numerical experiments on the MNIST and FashionMNIST datasets show that an image classifier whose architecture includes a neural ODE, when trained according to our strategy, is more stable than the same classifier trained in the classical way, and is therefore more robust and less vulnerable to adversarial attacks.
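For context, the link between $C$ and the logarithmic norm can be made explicit. The following is a standard bound from the theory of logarithmic norms, sketched in our own notation (the symbols $f$, $J$, $\mu_2$, and $\nu$ are not fixed by the abstract): write the neural ODE as $\dot{x}(t) = f(x(t), t)$ with Jacobian $J(x,t) = \partial f / \partial x$, and let $\mu_2(A) = \lambda_{\max}\big((A + A^\top)/2\big)$ denote the logarithmic norm induced by the spectral norm. If $\mu_2\big(J(x,t)\big) \le \nu(t)$ for all $x$, then any two solutions satisfy
$$\|x(t) - y(t)\| \;\le\; \exp\!\Big(\int_0^t \nu(s)\,ds\Big)\,\|x(0) - y(0)\|,$$
so the flow map on $[0,T]$ is Lipschitz with constant $C \le \exp\big(\int_0^T \nu(s)\,ds\big)$. Pushing the logarithmic norm of the Jacobian downward, for instance with a small Frobenius-norm perturbation of $J$, therefore directly shrinks this bound.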