Understanding the stability and long-time behavior of generative models is a fundamental problem in modern machine learning. This paper provides quantitative bounds on the sampling error of score-based generative models by leveraging stability and forgetting properties of the Markov chain associated with the reverse-time dynamics. Under weak assumptions, we identify two structural properties that control the propagation of initialization and discretization errors in the backward process: a Lyapunov drift condition and a Doeblin-type minorization condition. A practical consequence is the quantitative stability of the sampling procedure, since the reverse diffusion dynamics induces a contraction mechanism along the sampling trajectory. Our results clarify the role of stochastic dynamics in score-based models and provide a principled framework for analyzing the propagation of errors in such approaches.
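The two conditions can be sketched in their standard textbook forms; the notation below (Lyapunov function $V$, constants $\lambda, b$, small set $C$, minorization constant $\varepsilon$, measure $\nu$, and transition kernel $P$) is illustrative and not necessarily the paper's exact statement.

```latex
% Lyapunov drift condition: for a function V >= 1, constants
% lambda in (0,1) and b < infinity, and a set C,
P V(x) \;\le\; \lambda\, V(x) \;+\; b\,\mathbf{1}_C(x),
    \qquad x \in \mathcal{X},

% Doeblin-type minorization on C: for some epsilon in (0,1]
% and a probability measure nu,
P(x, A) \;\ge\; \varepsilon\, \nu(A),
    \qquad x \in C,\quad A \in \mathcal{B}(\mathcal{X}).
```

By Harris-type ergodic theorems, conditions of this form typically yield geometric contraction of the chain in a weighted total-variation distance, which is the kind of contraction mechanism along the sampling trajectory that the abstract refers to.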