Non-conservative uncertainty bounds are key both for assessing an estimation algorithm's accuracy and for downstream tasks, such as deployment in safety-critical contexts. In this paper, we derive a tight, non-asymptotic uncertainty bound for kernel-based estimation that can also handle correlated noise sequences. Its computation relies on a mild norm-boundedness assumption on the unknown function and the noise, and returns the worst-case function realization within the hypothesis class at an arbitrary query input location. The value of this worst-case function is shown to be given in terms of the posterior mean and covariance of a Gaussian process for an optimal choice of the measurement noise covariance. By rigorously analyzing the proposed approach and comparing it with other results in the literature, we show its effectiveness in returning tight and easy-to-compute bounds for kernel-based estimates.
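To make the Gaussian-process connection concrete, a schematic form of such a bound is sketched below under standard Gaussian-process-regression conventions; the symbols $\mu$, $\sigma$, $\beta$, the kernel Gram matrix $K$, and the noise covariance $\Sigma$ are illustrative placeholders and do not reproduce the paper's exact statement. Given data $(X, y)$, a kernel $k$, and a chosen $\Sigma$, the posterior mean and variance at a query input $x$ read
\[
\mu(x) = k(x, X)\,(K + \Sigma)^{-1} y,
\qquad
\sigma^2(x) = k(x, x) - k(x, X)\,(K + \Sigma)^{-1} k(X, x),
\]
and a bound of the kind described above certifies the estimate pointwise as
\[
|f(x) - \mu(x)| \le \beta\,\sigma(x),
\]
where $\beta$ depends on the assumed norm bounds on the unknown function $f$ and on the noise, and $\Sigma$ is selected optimally as discussed in the paper.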