In recent years, optimal transport has emerged as a fundamental methodology with applications spanning multiple research areas. However, the convergence rate of the empirical estimator to its population counterpart suffers from the curse of dimensionality, which prevents its application in high-dimensional settings. While entropic regularization has been shown to mitigate the curse of dimensionality and achieve a parametric convergence rate under mild conditions, these statistical guarantees have not been extended to general regularizers. Our work bridges this gap by establishing analogous results for a broader family of regularizers. Specifically, under boundedness constraints, we prove a convergence rate of order $n^{-1/2}$ with respect to the sample size $n$. Furthermore, we derive several central limit theorems for divergence-regularized optimal transport.
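For context, divergence-regularized OT is commonly defined as $\mathrm{OT}_\varepsilon(\mu,\nu)=\inf_{\pi\in\Pi(\mu,\nu)}\int c\,d\pi+\varepsilon\, D(\pi\,\|\,\mu\otimes\nu)$, of which entropic regularization (where $D$ is the Kullback-Leibler divergence) is the best-known instance. The following is a minimal sketch, not the paper's estimator, that uses the POT library's entropic Sinkhorn solver to illustrate the claimed $n^{-1/2}$ rate: the fluctuations of the empirical regularized cost should shrink like $1/\sqrt{n}$. The distributions, cost, and regularization strength are illustrative assumptions.

```python
# Illustrative only: empirical entropic OT cost from n samples, repeated
# to estimate its fluctuation. Under the parametric n^{-1/2} rate, the
# standard deviation should roughly halve each time n quadruples.
import numpy as np
import ot  # POT: Python Optimal Transport

rng = np.random.default_rng(0)
reg = 0.5  # entropic regularization strength (assumed value)

def empirical_entropic_ot(n):
    # Draw n samples each from two bounded source/target distributions
    # (uniform on shifted unit squares; purely illustrative choices).
    xs = rng.uniform(0.0, 1.0, size=(n, 2))
    xt = rng.uniform(0.2, 1.2, size=(n, 2))
    M = ot.dist(xs, xt)             # squared Euclidean cost matrix
    a, b = ot.unif(n), ot.unif(n)   # uniform empirical weights
    return ot.sinkhorn2(a, b, M, reg)  # entropic-regularized OT cost

for n in [100, 400, 1600]:
    vals = [empirical_entropic_ot(n) for _ in range(20)]
    print(n, np.std(vals))
```

Entropic regularization is used here only because Sinkhorn solvers are readily available; the paper's results concern a broader family of divergence regularizers.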