The behavior of GP regression depends on the choice of covariance function. Stationary covariance functions are preferred in machine learning applications. However, (non-periodic) stationary covariance functions are always mean reverting and can therefore exhibit pathological behavior when applied to data that does not relax to a fixed global mean value. In this paper we show that it is possible to use improper GP priors with infinite variance to define processes that are stationary but not mean reverting. To this aim, we make use of non-positive kernels that can only be defined in this limit regime. The resulting posterior distributions can be computed analytically and they involve a simple correction of the usual formulas. The main contribution of the paper is the introduction of a large family of smooth non-reverting covariance functions that closely resemble the kernels commonly used in the GP literature (e.g. squared exponential and Mat\'ern class). By analyzing both synthetic and real data, we demonstrate that these non-positive kernels solve some known pathologies of mean-reverting GP regression while retaining most of the favorable properties of ordinary smooth stationary kernels.
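The mean-reversion pathology described above can be illustrated with a minimal sketch (not from the paper; kernel parameters and data are hypothetical): a standard GP posterior with a squared exponential kernel and zero prior mean predicts the prior mean far from the data, even when the data follow an obvious trend.

```python
import numpy as np

# Squared exponential (RBF) kernel, the standard stationary choice.
def rbf(x1, x2, lengthscale=1.0, variance=1.0):
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# Training data on an upward trend: the data never relax to a fixed mean.
x_train = np.array([0.0, 1.0, 2.0, 3.0])
y_train = x_train.copy()  # y = x

noise = 1e-6  # small jitter for numerical stability
K = rbf(x_train, x_train) + noise * np.eye(len(x_train))

# Standard GP posterior mean at test points, assuming a zero prior mean.
x_test = np.array([3.5, 10.0, 50.0])
K_s = rbf(x_test, x_train)
mean = K_s @ np.linalg.solve(K, y_train)

# Near the data the prediction tracks the trend; far from the data the
# covariances vanish and the prediction reverts to the prior mean (0).
print(mean)
```

This is precisely the behavior the non-reverting kernels proposed in the paper are designed to avoid.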