Maximum mean discrepancies (MMDs) like the kernel Stein discrepancy (KSD) have grown central to a wide range of applications, including hypothesis testing, sampler selection, distribution approximation, and variational inference. In each setting, these kernel-based discrepancy measures are required to (i) separate a target P from other probability measures or even (ii) control weak convergence to P. In this article we derive new sufficient and necessary conditions to ensure (i) and (ii). For MMDs on separable metric spaces, we characterize those kernels that separate Bochner embeddable measures and introduce simple conditions for separating all measures with unbounded kernels and for controlling convergence with bounded kernels. We use these results on $\mathbb{R}^d$ to substantially broaden the known conditions for KSD separation and convergence control and to develop the first KSDs known to exactly metrize weak convergence to P. Along the way, we highlight the implications of our results for hypothesis testing, measuring and improving sample quality, and sampling with Stein variational gradient descent.
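For concreteness, the following standard definitions sketch the objects involved; the base kernel $k$ and the Langevin Stein construction below are generic textbook choices shown for illustration, not the specific constructions of this article. The MMD induced by a reproducing kernel $k$ with RKHS $\mathcal{H}_k$ is
$$
\mathrm{MMD}_k(\mathrm{Q}, \mathrm{P}) \;=\; \sup_{\|f\|_{\mathcal{H}_k} \le 1} \bigl| \mathbb{E}_{X \sim \mathrm{Q}}[f(X)] - \mathbb{E}_{Z \sim \mathrm{P}}[f(Z)] \bigr| \;=\; \bigl\| \mathbb{E}_{X \sim \mathrm{Q}}[k(X, \cdot)] - \mathbb{E}_{Z \sim \mathrm{P}}[k(Z, \cdot)] \bigr\|_{\mathcal{H}_k},
$$
and a KSD is the MMD generated by a Stein kernel built from $k$ and the score $\nabla \log p$ of the target density $p$ on $\mathbb{R}^d$; in the common Langevin form,
$$
k_{\mathrm{P}}(x, y) \;=\; \nabla_x \!\cdot\! \nabla_y k(x, y) + \nabla_x k(x, y) \!\cdot\! \nabla_y \log p(y) + \nabla_y k(x, y) \!\cdot\! \nabla_x \log p(x) + k(x, y)\, \nabla_x \log p(x) \!\cdot\! \nabla_y \log p(y),
$$
so that $\mathrm{KSD}_{\mathrm{P}}(\mathrm{Q})^2 = \mathbb{E}_{X, X' \sim \mathrm{Q}}[k_{\mathrm{P}}(X, X')]$ for independent $X, X'$, and, under mild regularity conditions, $\mathbb{E}_{Z \sim \mathrm{P}}[k_{\mathrm{P}}(Z, \cdot)] = 0$, so the target enters only through its score. In this notation, property (i) asks that $\mathrm{MMD}_k(\mathrm{Q}, \mathrm{P}) = 0$ only when $\mathrm{Q} = \mathrm{P}$, and property (ii) asks that $\mathrm{MMD}_k(\mathrm{Q}_n, \mathrm{P}) \to 0$ imply $\mathrm{Q}_n \Rightarrow \mathrm{P}$.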