Kernel Stein discrepancies (KSDs) have emerged over the last decade as a powerful tool for quantifying goodness-of-fit, with numerous successful applications. To the best of our knowledge, all existing KSD estimators with a known convergence rate attain the rate $n^{-1/2}$. In this work, we present two complementary results (proved with different strategies), establishing that the minimax lower bound for KSD estimation is $n^{-1/2}$ and thereby settling the optimality of these estimators. Our first result concerns KSD estimation on $\mathbb R^d$ with the Langevin-Stein operator; the explicit constant we obtain for the Gaussian kernel indicates that the difficulty of KSD estimation may increase exponentially with the dimension $d$. Our second result establishes the minimax lower bound for KSD estimation on general domains.
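For context, the objects involved can be made concrete via the standard Langevin-Stein construction (a sketch of well-known definitions, not a restatement of this paper's setup): for a target density $p$ on $\mathbb R^d$ with score function $s_p = \nabla \log p$ and a reproducing kernel $k$, the Stein kernel and the squared KSD of a sampling distribution $q$ are
\[
u_p(x,y) = s_p(x)^\top s_p(y)\,k(x,y) + s_p(x)^\top \nabla_y k(x,y) + s_p(y)^\top \nabla_x k(x,y) + \nabla_x \cdot \nabla_y k(x,y),
\]
\[
\mathrm{KSD}^2(q \,\|\, p) = \mathbb E_{x,y \sim q}\bigl[u_p(x,y)\bigr],
\qquad
\widehat{\mathrm{KSD}}{}^2_n = \frac{1}{n(n-1)} \sum_{i \neq j} u_p(x_i, x_j),
\]
where $x_1, \dots, x_n \sim q$ are i.i.d. The U-statistic $\widehat{\mathrm{KSD}}{}^2_n$ is unbiased and, under standard moment conditions (and when $q \neq p$, so the statistic is non-degenerate), deviates from $\mathrm{KSD}^2$ at rate $O_p(n^{-1/2})$; this is the $n^{-1/2}$ rate that the minimax lower bound shows cannot be improved.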