We establish a variant of the log-Sobolev and transport-information inequalities for mixture distributions. If a probability measure $\pi$ can be decomposed into components that individually satisfy such inequalities, then any measure $\mu$ close to $\pi$ in relative Fisher information is close in relative entropy or transport distance to a reweighted version of $\pi$ with the same mixture components but possibly different weights. This provides a user-friendly interpretation of Fisher information bounds for non-log-concave measures and explains phenomena observed in the analysis of Langevin Monte Carlo for multimodal distributions.
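As a rough illustration of the shape of such a statement (the notation, constants, and form below are our own schematic sketch, not taken from the paper): writing $\pi = \sum_i p_i \pi_i$ with each component $\pi_i$ satisfying a log-Sobolev inequality with constant $C_i$, a bound of this type would read

```latex
% Schematic only: weights q_i and the constant are unspecified placeholders.
\inf_{q \in \Delta}\,
\mathrm{KL}\!\left( \mu \,\Big\|\, \textstyle\sum_i q_i \pi_i \right)
\;\lesssim\;
\Bigl(\max_i C_i\Bigr)\,
\mathrm{FI}(\mu \,\|\, \pi),
```

where $\Delta$ is the probability simplex over component indices, $\mathrm{KL}$ is relative entropy, and $\mathrm{FI}$ is relative Fisher information; the infimum over $q$ reflects that $\mu$ is only guaranteed to be close to *some* reweighting of the components, not to $\pi$ itself.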