An asymptotic theory is established for linear functionals of the predictive function given by kernel ridge regression, when the reproducing kernel Hilbert space is equivalent to a Sobolev space. The theory covers a wide variety of linear functionals, including point evaluations, evaluation of derivatives, $L_2$ inner products, etc. We establish upper and lower bounds for these estimators and prove their asymptotic normality. It is shown that $\lambda\sim n^{-1}$ is the universal optimal order of magnitude for the smoothing parameter to balance the variance and the worst-case bias. The theory also implies that the optimal $L_\infty$ error of kernel ridge regression is attained under the smoothing parameter $\lambda\sim n^{-1}\log n$. These optimal rates for the smoothing parameter differ from the known optimal rate $\lambda\sim n^{-\frac{2m}{2m+d}}$ that minimizes the $L_2$ error of kernel ridge regression.
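The abstract's central comparison can be illustrated numerically. The sketch below (an assumption for illustration, not the paper's own experiment) fits kernel ridge regression with the Matérn-1/2 kernel $k(x,y)=e^{-|x-y|}$, whose RKHS is norm-equivalent to the Sobolev space $H^1[0,1]$ (so $m=1$, $d=1$), and compares the sup-norm error under the two smoothing-parameter orders mentioned above, $\lambda\sim n^{-1}$ and $\lambda\sim n^{-2m/(2m+d)} = n^{-2/3}$. The data-generating function and noise level are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))
f = lambda t: np.sin(2.0 * np.pi * t)        # hypothetical true regression function
y = f(x) + 0.1 * rng.standard_normal(n)

def K(a, b):
    # Matern-1/2 kernel; its RKHS is norm-equivalent to H^1[0,1]
    return np.exp(-np.abs(a[:, None] - b[None, :]))

def krr_predict(lam, t):
    # Kernel ridge regression: solve (K + n*lam*I) alpha = y, predict k(t, x) @ alpha
    alpha = np.linalg.solve(K(x, x) + n * lam * np.eye(n), y)
    return K(t, x) @ alpha

t = np.linspace(0.0, 1.0, 500)
for lam in (1.0 / n, n ** (-2.0 / 3.0)):     # n^{-1} vs n^{-2m/(2m+d)} with m = 1, d = 1
    err = np.max(np.abs(krr_predict(lam, t) - f(t)))
    print(f"lambda = {lam:.4g}: sup-norm error = {err:.3f}")
```

At a single sample size the two choices give comparable fits; the abstract's claims concern the asymptotic order of the error as $n\to\infty$, which this one-shot sketch does not verify.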