We propose a framework for transfer learning of discount curves across different fixed-income product classes. Motivated by challenges in estimating discount curves from sparse or noisy data, we extend kernel ridge regression (KR) to a vector-valued setting, formulating a convex optimization problem in a vector-valued reproducing kernel Hilbert space (RKHS). Each component of the solution corresponds to the discount curve implied by a specific product class. We introduce an additional regularization term motivated by economic principles, promoting smoothness of spread curves between product classes, and show that it leads to a valid separable kernel structure. A main theoretical contribution is a decomposition of the vector-valued RKHS norm induced by separable kernels. We further provide a Gaussian process interpretation of vector-valued KR, enabling quantification of estimation uncertainty. Illustrative examples show how transfer learning tightens confidence intervals compared to single-curve estimation. An extensive masking experiment demonstrates that transfer learning significantly improves extrapolation performance.
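To make the core construction concrete, the following is a minimal sketch of vector-valued kernel ridge regression with a separable kernel, K((x, i), (x', j)) = k(x, x') · B[i, j], where each component of the output corresponds to one product class. The function name, the RBF choice for the scalar kernel k, and the task-coupling matrix B are illustrative assumptions, not the paper's specification (in particular, the spread-smoothness regularizer discussed in the abstract would enter through a particular choice of B).

```python
import numpy as np

def separable_kernel_ridge(X, Y, B, lam, gamma=1.0):
    """Illustrative multi-task kernel ridge regression with a separable kernel
    K((x, i), (x', j)) = k(x, x') * B[i, j], with k a Gaussian (RBF) kernel.

    X: (n,) scalar inputs (e.g. times to maturity)
    Y: (n, m) targets, one column per task (product class)
    B: (m, m) symmetric PSD task-coupling matrix
    lam: ridge regularization weight
    """
    n, m = Y.shape
    # Scalar Gram matrix k(x_i, x_j) on the training inputs.
    K = np.exp(-gamma * (X[:, None] - X[None, :]) ** 2)
    # Representer theorem: solve (B ⊗ K + lam I) vec(A) = vec(Y)
    # (column-major vec ordering, so vec(K A B) = (B ⊗ K) vec(A) for symmetric B).
    G = np.kron(B, K) + lam * np.eye(n * m)
    a = np.linalg.solve(G, Y.reshape(-1, order="F"))
    A = a.reshape(n, m, order="F")  # one coefficient per (data point, task)

    def predict(Xs):
        # f_j(x) = sum_{i,l} A[i,l] k(x_i, x) B[l,j]  ->  Ks @ A @ B
        Ks = np.exp(-gamma * (Xs[:, None] - X[None, :]) ** 2)
        return Ks @ A @ B  # (len(Xs), m) curve values for all tasks

    return predict
```

The transfer-learning effect lives entirely in B: with B = I the tasks decouple into independent single-curve estimators, while off-diagonal mass in B lets observations from one product class inform the curves of the others.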