We introduce the setting of continuous index learning, in which a function of many variables varies only along a small number of directions at each point. For efficient estimation, it is beneficial for a learning algorithm to adapt, near each point $x$, to the subspace that captures the local variability of the function $f$. We pose this task as kernel adaptation along a manifold with noise and introduce Local EGOP learning, a recursive algorithm that uses the Expected Gradient Outer Product (EGOP) quadratic form as both a metric and an inverse covariance for the target distribution. We prove that Local EGOP learning adapts to the regularity of the function of interest, showing that, under a supervised noisy manifold hypothesis, intrinsic-dimensional learning rates are achieved for arbitrarily high-dimensional noise. Empirically, we compare our algorithm with the feature learning capabilities of deep learning, and we demonstrate improved regression quality over two-layer neural networks in the continuous single-index setting.
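For concreteness, the EGOP is the expected outer product of the gradient of $f$; writing $\rho$ for the input distribution on $\mathbb{R}^d$ (notation introduced here only for illustration), the standard definition reads
\[
\mathrm{EGOP}(f) \;=\; \mathbb{E}_{x \sim \rho}\!\left[\nabla f(x)\, \nabla f(x)^{\top}\right] \;\in\; \mathbb{R}^{d \times d},
\]
whose leading eigenvectors span the directions along which $f$ varies most. The local variant can be read as restricting this expectation to a neighborhood of each point $x$, matching the pointwise subspace adaptation described above.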