We consider linear approximation based on function evaluations in reproducing kernel Hilbert spaces of certain analytic weighted power series kernels and stationary kernels on the interval $[-1,1]$. Both classes contain the popular Gaussian kernel $K(x, y) = \exp(-\tfrac{1}{2}\varepsilon^2(x-y)^2)$. For weighted power series kernels we derive almost matching upper and lower bounds on the worst-case error. Applied to the Gaussian kernel, our results show that, up to a sub-exponential factor, the $n$th minimal error decays as $(\varepsilon/2)^n (n!)^{-1/2}$. The proofs are based on weighted polynomial interpolation and classical polynomial coefficient estimates, which we use to bound the Hilbert space norm of a weighted polynomial fooling function.
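The stated rate can be probed numerically. The sketch below is a minimal illustration, not the paper's construction: it approximates the worst-case error over the unit ball of the Gaussian RKHS by the supremum of the power function of kernel interpolation (which is worst-case optimal among algorithms using the same evaluation points), taken over a fine grid. The Chebyshev nodes, the grid size, the jitter term, and the helper names `gaussian_kernel` and `worst_case_error` are all our own choices for this sketch; the paper's upper bounds instead come from weighted polynomial interpolation.

```python
import numpy as np
from math import factorial

def gaussian_kernel(x, y, eps):
    """K(x, y) = exp(-0.5 * eps**2 * (x - y)**2), vectorised over 1-D arrays."""
    return np.exp(-0.5 * eps**2 * (x[:, None] - y[None, :])**2)

def worst_case_error(nodes, eps, grid):
    """Approximate sup_x P(x), where P(x)^2 = K(x, x) - k(x)^T K_X^{-1} k(x)
    is the squared power function of kernel interpolation at `nodes`."""
    K_XX = gaussian_kernel(nodes, nodes, eps)
    K_gX = gaussian_kernel(grid, nodes, eps)
    # Solve instead of inverting; a small jitter tames the severe
    # ill-conditioning typical of Gaussian kernel matrices.
    sol = np.linalg.solve(K_XX + 1e-12 * np.eye(len(nodes)), K_gX.T)
    p2 = 1.0 - np.sum(K_gX * sol.T, axis=1)  # K(x, x) = 1 for this kernel
    return np.sqrt(np.maximum(p2, 0.0)).max()

eps = 1.0
grid = np.linspace(-1.0, 1.0, 2001)
for n in range(2, 13, 2):
    # Chebyshev nodes as a convenient, not necessarily optimal, point set.
    nodes = np.cos((2.0 * np.arange(1, n + 1) - 1.0) * np.pi / (2.0 * n))
    e_n = worst_case_error(nodes, eps, grid)
    rate = (eps / 2.0)**n / np.sqrt(factorial(n))
    print(f"n={n:2d}  sup power function = {e_n:.3e}  "
          f"(eps/2)^n (n!)^(-1/2) = {rate:.3e}")
```

For moderate $n$ the two columns should decay at a comparable super-exponential pace; for larger $n$ the kernel matrix becomes too ill-conditioned in double precision for the comparison to remain reliable.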