Uncertainty estimation is essential for robust decision-making in the presence of ambiguous or out-of-distribution inputs. Gaussian Processes (GPs) are classical kernel-based models that offer principled uncertainty quantification and perform well on small- to medium-scale datasets. Alternatively, formulating the weight-space learning problem under tensor network assumptions yields scalable tensor network kernel machines. However, these assumptions break Gaussianity, which complicates standard probabilistic inference. This raises a fundamental question: how can tensor network kernel machines provide principled uncertainty estimates? We propose LA-TNKM, a novel Bayesian Tensor Network Kernel Machine that employs a (linearized) Laplace approximation for Bayesian inference. A comprehensive set of numerical experiments shows that the proposed method consistently matches or surpasses Gaussian Processes and Bayesian Neural Networks (BNNs) across diverse UCI regression benchmarks, highlighting both its effectiveness and practical relevance.
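To fix intuition for the (linearized) Laplace approximation mentioned above, the following is a minimal sketch for a generic parametric regression model that is linear in its weights, as a kernel machine is; it is not the paper's LA-TNKM implementation, and the names (`feature_map`, `laplace_posterior`, `predict`) and the toy data are illustrative assumptions only.

\begin{verbatim}
import numpy as np

def feature_map(X):
    # Stand-in for the Jacobian d f / d w at the MAP weights; for a model
    # that is linear in its weights this is simply the feature matrix Phi(X).
    return np.hstack([X, np.ones((X.shape[0], 1))])

def laplace_posterior(X, y, noise_var=0.1, prior_prec=1.0):
    """Gaussian weight posterior via a linearized Laplace approximation."""
    Phi = feature_map(X)                                   # N x D
    H = Phi.T @ Phi / noise_var + prior_prec * np.eye(Phi.shape[1])
    cov = np.linalg.inv(H)                                 # posterior covariance
    w_map = cov @ Phi.T @ y / noise_var                    # MAP weights (closed form here)
    return w_map, cov

def predict(X_new, w_map, cov, noise_var=0.1):
    """Predictive mean and variance from the linearized model."""
    Phi = feature_map(X_new)
    mean = Phi @ w_map
    var = noise_var + np.sum(Phi @ cov * Phi, axis=1)      # noise + epistemic variance
    return mean, var

# Toy usage on synthetic data
X = np.random.randn(50, 2)
y = X @ np.array([1.5, -0.7]) + 0.3 + 0.1 * np.random.randn(50)
w_map, cov = laplace_posterior(X, y)
mu, var = predict(np.random.randn(5, 2), w_map, cov)
\end{verbatim}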