We consider the problem of approximating the regression function from noisy vector-valued data by an online learning algorithm that uses an appropriate reproducing kernel Hilbert space (RKHS) as prior. In an online algorithm, i.i.d. samples become available one by one through a random process and are processed successively to build approximations to the regression function. We are interested in the asymptotic performance of such online approximation algorithms and show that the expected squared error in the RKHS norm can be bounded by $C^2 (m+1)^{-s/(2+s)}$, where $m$ is the number of samples processed so far, the parameter $0<s\leq 1$ expresses an additional smoothness assumption on the regression function, and the constant $C$ depends on the variance of the input noise, the smoothness of the regression function, and further parameters of the algorithm.
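The abstract states the bound but not the update rule itself. As a point of reference, for $s=1$ the stated bound decays like $(m+1)^{-1/3}$. Below is a minimal, hedged sketch of the kind of online regularized learning step in an RKHS that results of this type typically analyze, namely a Smale–Yao-style stochastic gradient update $f_{m+1} = f_m - \gamma_m\,\big((f_m(x_m) - y_m)K_{x_m} + \lambda_m f_m\big)$. The Gaussian kernel, the decaying step-size and regularization schedules, and the toy vector-valued target are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np


def gaussian_kernel(x, z, sigma=1.0):
    """Scalar Gaussian kernel; an assumed choice for illustration."""
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))


class OnlineKernelRegressor:
    """Online regularized learning in an RKHS via a kernel expansion.

    The current approximation is f_m = sum_i c_i k(x_i, .), where the
    c_i are vector-valued coefficients (one per processed sample).
    """

    def __init__(self, kernel=gaussian_kernel):
        self.kernel = kernel
        self.centers = []  # processed inputs x_1, ..., x_m
        self.coeffs = []   # vector-valued expansion coefficients c_i

    def predict(self, x):
        """Evaluate f_m(x) = sum_i c_i * k(x_i, x)."""
        if not self.centers:
            return 0.0
        return sum(c * self.kernel(xi, x)
                   for xi, c in zip(self.centers, self.coeffs))

    def update(self, x, y, gamma, lam):
        """One step: f_{m+1} = f_m - gamma * ((f_m(x) - y) K_x + lam * f_m)."""
        residual = self.predict(x) - y
        # The lam * f_m term shrinks all existing coefficients ...
        self.coeffs = [(1.0 - gamma * lam) * c for c in self.coeffs]
        # ... and the data-fit term appends a new kernel section at x.
        self.centers.append(x)
        self.coeffs.append(-gamma * residual)


# Usage: noisy vector-valued samples processed one by one.
rng = np.random.default_rng(0)
learner = OnlineKernelRegressor()
for m in range(200):
    x = rng.uniform(-1.0, 1.0, size=2)
    # Hypothetical smooth target plus i.i.d. noise, for illustration only.
    y = np.array([np.sin(np.pi * x[0]), x[1] ** 2]) + 0.1 * rng.normal(size=2)
    gamma = 0.5 / (m + 1) ** (2.0 / 3.0)  # assumed decaying step size
    lam = 0.1 / (m + 1) ** (1.0 / 3.0)    # assumed decaying regularization
    learner.update(x, y, gamma, lam)
```

Each update shrinks the existing expansion coefficients (the effect of the $\lambda_m f_m$ term) and appends one new kernel section centered at the current sample, so the approximation always lies in the span of the kernel sections at the inputs processed so far.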