Neural Controlled Differential Equations (NCDEs) are a state-of-the-art tool for supervised learning with irregularly sampled time series (Kidger, 2020). However, no theoretical analysis of their performance has been provided yet, and in particular it remains unclear how the irregularity of the time series affects their predictions. By merging the rich theory of controlled differential equations (CDEs) with Lipschitz-based measures of the complexity of deep neural networks, we take a first step towards a theoretical understanding of NCDEs. Our first result is a generalization bound for this class of predictors that depends on the regularity of the time series data. Second, we leverage the continuity of the flow of CDEs to provide a detailed analysis of both the sampling-induced bias and the approximation bias. In the process, we show how classical approximation results for neural networks carry over to NCDEs. Our theoretical results are validated through a series of experiments.