Transfer learning aims to improve performance on a target task by leveraging information from related source tasks. We propose a nonparametric regression transfer learning framework that explicitly models heterogeneity in the source-target relationship. Our approach relies on a local transfer assumption: the covariate space is partitioned into finitely many cells such that, within each cell, the target regression function can be expressed as a low-complexity transformation of the source regression function. This localized structure enables effective transfer where similarity is present while limiting negative transfer elsewhere. We introduce estimators that jointly learn the local transfer functions and the target regression, together with fully data-driven procedures that adapt to unknown partition structure and transfer strength. We establish sharp minimax rates for target regression estimation, showing that local transfer can mitigate the curse of dimensionality by exploiting reduced functional complexity. Our theoretical guarantees take the form of oracle inequalities that decompose excess risk into estimation and approximation terms, ensuring robustness to model misspecification. Numerical experiments illustrate the benefits of the proposed approach.
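The local transfer assumption described above can be illustrated with a minimal numerical sketch. Everything below is a hypothetical one-dimensional example, not the paper's actual estimator: the covariate space [0, 1) is split into two known cells, the transfer functions are taken to be affine, the source regression is estimated with a simple k-NN smoother, and the per-cell transfer maps are fit by least squares on a small target sample.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground truth: within each of two cells, the target
# regression is an affine transformation of the source regression.
def f_source(x):
    return np.sin(2 * np.pi * x)

def f_target(x):
    # cell 0 (x < 0.5): g_0(u) = 2u + 1;  cell 1 (x >= 0.5): g_1(u) = -u
    return np.where(x < 0.5, 2 * f_source(x) + 1, -f_source(x))

# Abundant source data, scarce target data (the typical transfer setting).
x_s = rng.uniform(0, 1, 2000)
y_s = f_source(x_s) + 0.1 * rng.standard_normal(2000)
x_t = rng.uniform(0, 1, 100)
y_t = f_target(x_t) + 0.1 * rng.standard_normal(100)

# Step 1: nonparametric source estimate via a k-nearest-neighbor smoother.
def knn_predict(x_train, y_train, x_query, k=20):
    d = np.abs(x_train[None, :] - x_query[:, None])
    idx = np.argsort(d, axis=1)[:, :k]
    return y_train[idx].mean(axis=1)

f_s_hat_t = knn_predict(x_s, y_s, x_t)  # source fit at the target points

# Step 2: within each cell, fit a low-complexity (here affine) transfer
# function g_j(u) = a_j * u + b_j by least squares on the target sample.
cells_t = (x_t >= 0.5).astype(int)
transfer = {}
for j in (0, 1):
    m = cells_t == j
    A = np.column_stack([f_s_hat_t[m], np.ones(m.sum())])
    transfer[j], *_ = np.linalg.lstsq(A, y_t[m], rcond=None)

# Step 3: estimate the target regression as g_j applied to the source fit.
x_grid = np.linspace(0, 1, 200)
f_s_grid = knn_predict(x_s, y_s, x_grid)
cells_g = (x_grid >= 0.5).astype(int)
pred = np.array([transfer[c] @ [u, 1.0] for c, u in zip(cells_g, f_s_grid)])

mse = np.mean((pred - f_target(x_grid)) ** 2)
```

Because each transfer map is only two-dimensional, 50 target points per cell suffice to learn it accurately once the source regression is well estimated; this is the sense in which local transfer trades the full nonparametric complexity of the target function for the lower complexity of the cell-wise transformations. The paper's actual procedures additionally learn the partition itself from data, which this sketch takes as known.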