Reversing a diffusion process by learning its score is central both to diffusion-based generative modeling and to estimating properties of scientific systems. The diffusion processes that are tractable, however, are essentially linear processes with a Gaussian stationary distribution. This limits the models that can be built to those targeting a Gaussian prior and, more generally, limits the problems that can be generically solved to those with conditionally linear score functions. In this work, we introduce a family of tractable denoising score matching objectives, called local-DSM, built using local increments of the diffusion process. We show how local-DSM, combined with Taylor expansions, enables automated training and score estimation with nonlinear diffusion processes. To demonstrate these ideas, we use automated-DSM to train generative models with non-Gaussian priors on challenging low-dimensional distributions and on the CIFAR10 image dataset. We also use automated-DSM to learn the scores of nonlinear processes studied in statistical physics.
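For background on the tractability constraint the abstract refers to, the sketch below shows classical denoising score matching (Vincent, 2011) for a Gaussian perturbation kernel, where the conditional score is analytic. This is a minimal illustration of standard DSM, not the paper's local-DSM objective; the function and variable names are my own.

```python
import numpy as np

def dsm_loss(score_fn, x0, sigma, rng):
    """Classical denoising score matching for the Gaussian perturbation
    x_t = x_0 + sigma * eps.  The kernel's conditional score is analytic,
    grad_x log p(x_t | x_0) = -(x_t - x_0) / sigma**2, which is exactly
    the linear/Gaussian tractability that standard DSM relies on (and
    that the abstract's local-DSM relaxes for nonlinear processes).
    """
    eps = rng.standard_normal(x0.shape)
    xt = x0 + sigma * eps
    target = -(xt - x0) / sigma**2  # analytic conditional score
    return float(np.mean((score_fn(xt) - target) ** 2))

# Toy check: for x0 ~ N(0, 1), the marginal of x_t is N(0, 1 + sigma^2),
# so the true marginal score is s(x) = -x / (1 + sigma^2).  The DSM loss
# is minimized (up to an irreducible constant) by this true score.
rng = np.random.default_rng(0)
x0 = rng.standard_normal(100_000)
sigma = 0.5
true_score = lambda x: -x / (1.0 + sigma**2)
loss = dsm_loss(true_score, x0, sigma, rng)
```

The loss at the true score is not zero: DSM objectives have an irreducible constant equal to the conditional variance of the target, but the true marginal score still attains a lower loss than any other score function.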