Recently, the vanishing-step-size limit of the Sinkhorn algorithm at finite regularization parameter $\varepsilon$ was shown to be a mirror descent in the space of probability measures. We give $L^2$ contraction criteria in two time-dependent metrics induced by the mirror Hessian, which reduce to the coercivity of certain conditional expectation operators. We then give an exact identity for the entropy production rate of the Sinkhorn flow, a quantity previously known only to be nonpositive. Examining this rate shows that the standard semigroup analysis of diffusion processes extends systematically to the Sinkhorn flow. We show that the flow induces a reversible Markov dynamics on the target marginal as an Onsager gradient flow. We define the Dirichlet form associated to its (nonlocal) infinitesimal generator, prove a Poincar\'e inequality for it, and show that the spectral gap is strictly positive along the Sinkhorn flow whenever $\varepsilon > 0$. Lastly, we show that the entropy decays exponentially if and only if a logarithmic Sobolev inequality (LSI) holds. For illustration, we give two immediate practical use cases for the Sinkhorn LSI: as a design principle for the latent space in which generative models are trained, and as a stopping heuristic for discrete-time algorithms.
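The final equivalence follows the standard Gr\"onwall mechanism familiar from diffusion semigroups. The sketch below uses generic notation ($H$ for relative entropy, $I$ for the entropy production rate, $\lambda$ for the LSI constant) and illustrates the general argument, not the paper's precise constants or hypotheses:

```latex
\[
\frac{d}{dt}\, H(\rho_t) = -I(\rho_t)
\qquad \text{(entropy production identity)},
\]
\[
H(\rho) \le \frac{1}{2\lambda}\, I(\rho)
\qquad \text{(LSI with constant } \lambda > 0\text{)},
\]
\[
\frac{d}{dt}\, H(\rho_t) \le -2\lambda\, H(\rho_t)
\;\Longrightarrow\;
H(\rho_t) \le H(\rho_0)\, e^{-2\lambda t}
\qquad \text{(Gr\"onwall's lemma)}.
\]
```

The converse direction, that exponential decay along the flow forces an LSI, is what makes the equivalence an "if and only if" rather than a one-way implication.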