We introduce the setting of continuous index learning, in which a function of many variables varies only along a small number of directions at each point. For efficient estimation, it is beneficial for a learning algorithm to adapt, near each point $x$, to the subspace that captures the local variability of the function $f$. We pose this task as kernel adaptation along a manifold with noise, and introduce Local EGOP learning, a recursive algorithm that uses the Expected Gradient Outer Product (EGOP) quadratic form as both a metric and the inverse covariance of the target distribution. We prove that Local EGOP learning adapts to the regularity of the function of interest, showing that, under a supervised noisy manifold hypothesis, it achieves intrinsic-dimensional learning rates even for arbitrarily high-dimensional noise. Empirically, we compare our algorithm to the feature learning capabilities of deep networks, and demonstrate improved regression quality over two-layer neural networks in the continuous single-index setting.
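To make the EGOP concrete: for a single-index function $f(x) = g(\langle w, x\rangle)$, the matrix $\mathbb{E}[\nabla f(x)\,\nabla f(x)^\top]$ has its top eigenvector aligned with the index direction $w$. The sketch below illustrates this with a finite-difference gradient estimate; the specific function, sample size, and estimator are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

# Illustrative sketch (not the paper's estimator): empirical EGOP
# for a single-index function f(x) = g(<w, x>).
rng = np.random.default_rng(0)
d = 10
w = np.zeros(d)
w[0] = 1.0                            # true index direction (assumed)
f = lambda X: np.tanh(X @ w)          # f varies only along w

X = rng.standard_normal((2000, d))
h = 1e-4
# Central finite-difference gradients of f at each sample point.
grads = np.stack([
    (f(X + h * np.eye(d)[j]) - f(X - h * np.eye(d)[j])) / (2 * h)
    for j in range(d)
], axis=1)                            # shape (n, d)

# EGOP = E[grad f(x) grad f(x)^T], estimated by averaging outer products.
egop = grads.T @ grads / len(X)

# The top eigenvector of the EGOP recovers the index direction w.
eigvals, eigvecs = np.linalg.eigh(egop)
v = eigvecs[:, -1]
print(abs(v @ w))  # ≈ 1
```

In practice $f$ is unknown, so the gradients would be estimated from a fitted regressor rather than from $f$ itself; the point here is only that the EGOP's leading eigenspace identifies the directions along which $f$ actually varies.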