Score-based generative modeling, implemented through probability flow ODEs, has shown impressive results in numerous practical settings. However, most convergence guarantees rely on restrictive regularity assumptions on the target distribution -- such as strong log-concavity or bounded support. This work establishes non-asymptotic convergence bounds in the 2-Wasserstein distance for a general class of probability flow ODEs under considerably weaker assumptions: weak log-concavity and Lipschitz continuity of the score function. Our framework accommodates non-log-concave distributions, such as Gaussian mixtures, and explicitly accounts for initialization errors, score approximation errors, and the effects of discretization via an exponential integrator scheme. Addressing a key theoretical challenge in diffusion-based generative modeling, our results extend convergence theory to more realistic data distributions and practical ODE solvers. We provide concrete guarantees for the efficiency and correctness of the sampling algorithm, complementing the empirical success of diffusion models with rigorous theory. Moreover, from a practical perspective, our explicit rates may help guide the choice of hyperparameters, such as the discretization step size.
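To fix ideas, here is a minimal sketch of the two objects named above, assuming the standard Ornstein--Uhlenbeck forward process $dX_t = -X_t\,dt + \sqrt{2}\,dB_t$ with marginal laws $p_t$; the notation ($s_\theta$ for the learned score, $h$ for the step size) is illustrative and not necessarily the paper's. The probability flow ODE, run from $t = 0$ to $t = T$ after time reversal, reads
\[
\frac{dY_t}{dt} \;=\; Y_t + \nabla \log p_{T-t}(Y_t), \qquad Y_0 \sim p_T,
\]
and an exponential integrator step freezes the (approximate) score at the left endpoint of each interval and integrates the remaining linear part exactly:
\[
Y_{t_{k+1}} \;=\; e^{h}\, Y_{t_k} + \bigl(e^{h} - 1\bigr)\, s_\theta\bigl(Y_{t_k},\, T - t_k\bigr), \qquad h = t_{k+1} - t_k.
\]
In practice $p_T$ is unknown and $Y_0$ is drawn from a standard Gaussian instead; this mismatch is the initialization error referred to above, alongside the score approximation error ($s_\theta \approx \nabla \log p$) and the discretization error of the scheme.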