Gaussian processes (GPs) are crucial in machine learning for quantifying uncertainty in predictions. However, their associated covariance matrices, defined by kernel functions, are typically dense and large-scale, posing significant computational challenges. This paper introduces a matrix-free method that utilizes the Non-equispaced Fast Fourier Transform (NFFT) to achieve nearly linear complexity in the multiplication of kernel matrices and their derivatives with vectors for a predetermined accuracy level. To address high-dimensional problems, we propose an additive kernel approach. Each sub-kernel in this approach captures lower-order feature interactions, allowing for the efficient application of the NFFT method and potentially increasing accuracy across various real-world datasets. Additionally, we implement a preconditioning strategy that accelerates hyperparameter tuning, further improving the efficiency and effectiveness of GPs.
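To make the additive-kernel idea concrete, here is a minimal sketch (not the paper's implementation; all function names, feature groupings, and hyperparameters are illustrative) of a kernel built as a sum of sub-kernels, each acting on a small feature subset so that every sub-kernel remains low-dimensional and thus amenable to NFFT-accelerated matrix-vector products:

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    # Squared-exponential kernel on a low-dimensional feature subset.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def additive_kernel(X1, X2, subsets, lengthscales):
    # Sum of sub-kernels, each restricted to a small group of features,
    # capturing lower-order feature interactions as in the abstract.
    K = np.zeros((X1.shape[0], X2.shape[0]))
    for idx, ls in zip(subsets, lengthscales):
        K += rbf_kernel(X1[:, idx], X2[:, idx], ls)
    return K

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))
y = np.sin(X[:, 0]) + X[:, 3] ** 2 + 0.1 * rng.normal(size=50)

# Illustrative pairwise feature groups (order-2 interactions).
subsets = [[0, 1], [2, 3], [4, 5]]
K = additive_kernel(X, X, subsets, [1.0, 1.0, 1.0])
K += 1e-2 * np.eye(50)  # noise / jitter term

# GP posterior mean at the training inputs, via a dense solve as a
# reference; the paper's method would replace these dense matvecs
# with NFFT-accelerated, nearly linear-complexity ones.
alpha = np.linalg.solve(K, y)
mean = K @ alpha
```

The dense solve here costs O(n^3); the point of the paper's matrix-free approach is that, given fast kernel matvecs, an iterative solver (e.g. preconditioned conjugate gradients) can replace it at nearly linear cost per iteration.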