We perform a non-asymptotic analysis of the contrastive divergence (CD) algorithm, a training method for unnormalized models. While prior work has established that (for exponential family distributions) the CD iterates asymptotically converge at an $O(n^{-1 / 3})$ rate to the true parameter of the data distribution, we show, under some regularity assumptions, that CD can achieve the parametric rate $O(n^{-1 / 2})$. Our analysis covers several data batching schemes, including the fully online and minibatch settings. We additionally show that CD can be near-optimal, in the sense that its asymptotic variance is close to the Cram\'er-Rao lower bound.
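As a brief illustration (the notation here is ours, not the paper's), consider an exponential family $p_\theta(x) \propto \exp(\langle \theta, T(x)\rangle)$ with sufficient statistic $T$. The fully online CD-$1$ update replaces the intractable model expectation of $T$ in the log-likelihood gradient with a single MCMC step started at the current data point:
\[
\theta_{t+1} \;=\; \theta_t + \eta_t \bigl( T(x_t) - T(\tilde{x}_t) \bigr),
\qquad \tilde{x}_t \sim K_{\theta_t}(x_t, \cdot),
\]
where $x_t$ is the $t$-th data point, $\eta_t$ is a step size, and $K_\theta$ is a Markov transition kernel (e.g., one Gibbs sweep) that leaves $p_\theta$ invariant; CD-$k$ applies $K_{\theta_t}$ $k$ times before evaluating $T$.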