It is well known that the eigenfunctions of a kernel play a crucial role in kernel regression. Through several examples, we demonstrate that even with the same set of eigenfunctions, the order in which they appear significantly impacts regression outcomes. Simplifying the model by diagonalizing the kernel, we introduce an over-parameterized gradient descent method in the setting of sequence models to capture the effects of different orderings of a fixed set of eigenfunctions. Our theoretical results show that the over-parameterized gradient flow can adapt to the underlying structure of the signal and significantly outperform vanilla gradient flow. Moreover, we demonstrate that deeper over-parameterization further enhances the generalization capability of the model. These results not only provide a new perspective on the benefits of over-parameterization but also offer insights into the adaptivity and generalization potential of neural networks beyond the kernel regime.
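To make the setup concrete, the sketch below is a minimal illustrative example (our own construction, not code from the paper): on a simple sequence model with a sparse signal, it contrasts vanilla gradient descent on the coefficients with a depth-`depth` over-parameterization theta_i = u_i^depth (split into positive and negative parts to allow signed signals). The function names, the noise level, and all hyperparameters are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sequence model: observe y_i = theta*_i + noise, where only a
# few coordinates of theta* are nonzero and their positions need not follow
# the kernel's eigenvalue ordering.
n = 200
theta_star = np.zeros(n)
theta_star[[3, 17, 50]] = [1.0, -0.8, 0.5]
y = theta_star + 0.05 * rng.standard_normal(n)

def vanilla_gd(y, lr=0.1, steps=300):
    """Plain gradient descent on the squared loss 0.5 * ||theta - y||^2."""
    theta = np.zeros_like(y)
    for _ in range(steps):
        theta -= lr * (theta - y)
    return theta

def overparam_gd(y, depth=3, lr=0.1, steps=3000, init=1e-2):
    """Depth-`depth` over-parameterization theta_i = u_i^depth - v_i^depth,
    with u, v initialized near zero. Gradient descent on (u, v) yields
    multiplicative dynamics: coordinates with large signal escape the small
    initialization first, while noise-only coordinates stay near zero."""
    u = np.full_like(y, init)  # models the positive part of theta
    v = np.full_like(y, init)  # models the negative part of theta
    for _ in range(steps):
        theta = u**depth - v**depth
        resid = theta - y  # gradient of the loss w.r.t. theta
        # Chain rule: d theta_i / d u_i = depth * u_i^(depth-1), and
        # d theta_i / d v_i = -depth * v_i^(depth-1).
        u -= lr * resid * depth * u**(depth - 1)
        v -= lr * (-resid) * depth * v**(depth - 1)
    return u**depth - v**depth

theta_vanilla = vanilla_gd(y)
theta_deep = overparam_gd(y)
print("vanilla error:", np.linalg.norm(theta_vanilla - theta_star))
print("over-param error:", np.linalg.norm(theta_deep - theta_star))
```

Under these assumed settings, vanilla gradient descent converges toward y itself and so fits the noise in every coordinate, whereas the over-parameterized dynamics (stopped early) recover the three large coordinates while leaving the remaining ones near zero; increasing `depth` sharpens this separation, consistent with the abstract's claim that deeper over-parameterization improves generalization.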