By selecting different filter functions, spectral algorithms can generate a variety of regularization methods for solving statistical inverse problems within the learning-from-samples framework. This paper combines distributed spectral algorithms with Sobolev kernels to tackle the functional linear regression problem. The design and mathematical analysis of the algorithms require only that the functional covariates be observed at discrete sample points. Furthermore, the hypothesis spaces of the algorithms are the Sobolev spaces generated by the Sobolev kernels, balancing approximation capability and flexibility. By establishing regularity conditions on the target function and the functional covariate, we derive matching upper and lower bounds for the convergence of the distributed spectral algorithms in the Sobolev norm. This demonstrates that the proposed regularity conditions are reasonable and that the convergence analysis under these conditions is tight, capturing the essential characteristics of functional linear regression. The analytical techniques and estimates developed in this paper also improve upon existing results in the literature.