In this paper, we conduct a comprehensive analysis of the generalization properties of Kernel Ridge Regression (KRR) in the noiseless regime, a scenario crucial to scientific computing, where data are often generated by computer simulations. We prove that KRR can attain the minimax optimal rate, which depends on both the eigenvalue decay of the associated kernel and the relative smoothness of the target function. In particular, when the eigenvalues decay exponentially fast, KRR achieves spectral accuracy, i.e., a convergence rate faster than any polynomial. Moreover, numerical experiments corroborate our theoretical findings. Our proof leverages a novel extension of the duality framework introduced by Chen et al. (2023), which may prove useful for analyzing kernel-based methods beyond the scope of this work.
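For concreteness, the KRR estimator in the noiseless setting can be sketched as follows. This is a minimal illustration, not the paper's experimental setup: the Gaussian kernel, its bandwidth `gamma`, and the regularization level `lam` are assumptions chosen for the example, and the labels are exact (noise-free) evaluations of a smooth target.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=10.0):
    # Gaussian (RBF) kernel matrix between 1-D point sets X and Y
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-gamma * d2)

def krr_fit_predict(X_train, y_train, X_test, lam=1e-8, gamma=10.0):
    # KRR: solve (K + n * lam * I) alpha = y, then predict k(x)^T alpha
    n = len(X_train)
    K = rbf_kernel(X_train, X_train, gamma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y_train)
    return rbf_kernel(X_test, X_train, gamma) @ alpha

# Noiseless regime: labels are exact evaluations of the target function
rng = np.random.default_rng(0)
f = np.sin
X_train = np.sort(rng.uniform(0.0, 1.0, 50))
X_test = np.linspace(0.0, 1.0, 200)
pred = krr_fit_predict(X_train, f(X_train), X_test)
err = np.max(np.abs(pred - f(X_test)))  # sup-norm error on a test grid
```

Because the Gaussian kernel's eigenvalues decay exponentially and the target here is very smooth, the error on this toy problem is already tiny at moderate sample sizes, in line with the spectral-accuracy phenomenon described above.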