Diffusion-based generative processes, formulated as differential equation solving, face a trade-off between computational speed and sample quality. Our theoretical investigation of ODE- and SDE-based solvers reveals complementary weaknesses: ODE solvers accumulate irreducible gradient error along deterministic trajectories, while SDE methods suffer from amplified discretization errors when the step budget is limited. Building on this insight, we introduce AdaSDE, a novel single-step SDE solver that unifies the efficiency of ODEs with the error resilience of SDEs. Specifically, each step carries a single learnable coefficient, estimated via lightweight distillation, that dynamically regulates the error correction strength to accelerate diffusion sampling. Notably, our framework can be integrated with existing solvers to enhance their capabilities. Extensive experiments demonstrate state-of-the-art performance: at 5 NFE, AdaSDE achieves FID scores of 4.18 on CIFAR-10, 8.05 on FFHQ, and 6.96 on LSUN Bedroom. Code is available at https://github.com/WLU-wry02/AdaSDE.
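To make the idea concrete, the following is a minimal sketch, not the official AdaSDE implementation, of a single-step SDE sampler in which the per-step stochastic "churn" strength is a learnable coefficient rather than a fixed heuristic; the names `denoiser`, `sigmas`, and `gamma` are illustrative assumptions.

```python
# Minimal sketch (assumed interface, not the authors' code): an Euler-type SDE
# sampler where gamma[i] is a learnable per-step coefficient controlling how
# much noise is re-injected (the error-correction strength) before each
# deterministic update. gamma would be estimated offline, e.g. by lightweight
# distillation against a fine-grained reference solver.
import torch

@torch.no_grad()
def adaptive_sde_sample(denoiser, x, sigmas, gamma):
    """
    denoiser(x, sigma) -> denoised estimate D(x, sigma)      (one NFE per call)
    sigmas : decreasing noise levels, e.g. [sigma_max, ..., sigma_min, 0]
    gamma  : per-step coefficients of length len(sigmas) - 1 (learned)
    """
    for i in range(len(sigmas) - 1):
        sigma, sigma_next = sigmas[i], sigmas[i + 1]

        # Stochastic error correction: raise the noise level by a factor
        # (1 + gamma[i]) and re-inject matching Gaussian noise.
        sigma_hat = sigma * (1.0 + gamma[i])
        noise_std = (sigma_hat ** 2 - sigma ** 2).clamp(min=0.0).sqrt()
        x = x + noise_std * torch.randn_like(x)

        # Deterministic (ODE-like) Euler step from sigma_hat down to sigma_next.
        d = (x - denoiser(x, sigma_hat)) / sigma_hat
        x = x + (sigma_next - sigma_hat) * d
    return x
```

Setting `gamma` to zero recovers a plain deterministic Euler (ODE) solver, so the per-step coefficients interpolate between the ODE regime and progressively stronger stochastic correction, which is the behavior the abstract describes.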