The classical kernel ridge regression problem seeks the best fit for the output $Y$ as a function of the input data $X\in \mathbb{R}^d$, with the regularization term fixed by a given choice of reproducing kernel Hilbert space, such as a Sobolev space. Here we consider a generalization of kernel ridge regression obtained by introducing an extra matrix parameter $U$, which aims to detect the scale parameters and the feature variables in the data, and thereby to improve the efficiency of kernel ridge regression. This naturally leads to a nonlinear variational problem for the optimal choice of $U$. We study several foundational mathematical aspects of this variational problem, and in particular its behavior in the presence of multiscale structures in the data.
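As a minimal illustration of the setup described above (not the paper's construction), the following sketch implements kernel ridge regression on $U$-transformed inputs. The Gaussian kernel and the regularization form $\lambda n I$ are assumptions made here for concreteness; in the generalized problem, the matrix $U$ would itself be optimized rather than fixed.

```python
import numpy as np


def gaussian_kernel(A, B):
    """Gaussian (RBF) kernel matrix between the rows of A and B.

    The Gaussian kernel is an illustrative choice; any reproducing
    kernel (e.g. a Sobolev kernel) could be substituted.
    """
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * sq)


def krr_fit(X, Y, U, lam):
    """Kernel ridge regression with inputs rescaled by the matrix U.

    Solves (K + lam * n * I) alpha = Y, where K_ij = k(U x_i, U x_j).
    Here U plays the role of the extra matrix parameter: it rescales
    and mixes the input coordinates before the kernel is applied.
    """
    Z = X @ U.T                      # U-transformed inputs
    K = gaussian_kernel(Z, Z)
    n = X.shape[0]
    alpha = np.linalg.solve(K + lam * n * np.eye(n), Y)
    return alpha


def krr_predict(X_train, X_test, U, alpha):
    """Evaluate the fitted regressor at new points X_test."""
    K_cross = gaussian_kernel(X_test @ U.T, X_train @ U.T)
    return K_cross @ alpha
```

With $U$ equal to the identity this reduces to ordinary kernel ridge regression; a $U$ with rows aligned to the relevant feature directions, or with entries adapted to the scales present in the data, changes the kernel geometry and hence the fit. Optimizing over $U$ is what turns the linear ridge problem into the nonlinear variational problem studied in the paper.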