We formulate well-posed continuous-time generative flows for learning distributions supported on low-dimensional manifolds through Wasserstein proximal regularizations of $f$-divergences. The Wasserstein-1 proximal operator regularizes $f$-divergences so that singular distributions can be compared, while the Wasserstein-2 proximal operator regularizes the paths of generative flows by adding an optimal transport cost, i.e., a kinetic-energy penalty. Via mean-field game (MFG) theory, we show that combining the two proximals is critical for formulating well-posed generative flows. A generative flow can be analyzed through the optimality conditions of an MFG: a system coupling a backward Hamilton-Jacobi (HJ) equation with a forward continuity partial differential equation (PDE), whose solution characterizes the optimal generative flow. For distributions supported on low-dimensional manifolds, the MFG theory shows that the Wasserstein-1 proximal, which regularizes the HJ terminal condition, and the Wasserstein-2 proximal, which regularizes the HJ dynamics, are both necessary for the backward-forward PDE system to be well-defined and to admit a unique solution with provably linear flow trajectories. Consequently, the generative flow is itself unique and can be learned robustly, even for high-dimensional distributions supported on low-dimensional manifolds. The flows are learned through adversarial training of continuous-time flows, which bypasses the need for reverse simulation. We demonstrate the efficacy of our approach for generating high-dimensional images without resorting to autoencoders or specialized architectures.
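The backward-forward optimality system referenced above can be sketched in its standard MFG form, assuming a quadratic (kinetic-energy) Hamiltonian and the velocity field $v = -\nabla U$; the precise terminal condition involving the Wasserstein-1 proximal of the $f$-divergence is specific to the paper's construction, so the generic first-variation form below is only a placeholder.

```latex
% Sketch of the standard MFG optimality system with quadratic Hamiltonian.
% U: value function (solved backward), rho: flow density (solved forward).
% Sign conventions vary across references; this is one common choice.
\begin{align}
  -\partial_t U(x,t) + \tfrac{1}{2}\,\|\nabla U(x,t)\|^{2} &= 0
      && \text{(backward HJ equation)} \\
  \partial_t \rho(x,t) - \nabla \cdot \big( \rho(x,t)\, \nabla U(x,t) \big) &= 0
      && \text{(forward continuity equation)} \\
  \rho(\cdot,0) = \rho_{0}, \qquad
  U(\cdot,T) &= \frac{\delta D}{\delta \rho}\big(\rho(\cdot,T)\,\big\|\,\rho_{\mathrm{target}}\big)
      && \text{(initial/terminal conditions)}
\end{align}
```

Here $D$ stands for the (proximally regularized) divergence whose first variation supplies the terminal cost; in the paper this is the Wasserstein-1 proximal of an $f$-divergence rather than the raw divergence.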
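The adversarial training of a continuous-time flow with a kinetic-energy (Wasserstein-2) penalty can be sketched as follows. This is a minimal toy implementation in PyTorch, not the paper's exact method: the network sizes, Euler discretization, BCE adversarial objective, and penalty weight `0.1` are all illustrative assumptions. Note that only forward simulation of the flow is required.

```python
# Sketch (assumed implementation): adversarial training of a continuous-time
# generative flow dx/dt = v(x, t), Euler-integrated forward in time, with a
# Wasserstein-2 proximal realized as a kinetic-energy penalty on v.
import torch
import torch.nn as nn

torch.manual_seed(0)
dim, steps, dt = 2, 8, 1.0 / 8

# Velocity field v(x, t) parameterizing the flow.
velocity = nn.Sequential(nn.Linear(dim + 1, 32), nn.Tanh(), nn.Linear(32, dim))
# Discriminator acting as the adversary (stand-in for the divergence term).
disc = nn.Sequential(nn.Linear(dim, 32), nn.Tanh(), nn.Linear(32, 1))

def push_forward(z):
    """Euler-integrate the flow forward; accumulate the kinetic energy."""
    x, kinetic = z, torch.zeros(())
    for k in range(steps):
        t = torch.full((x.shape[0], 1), k * dt)
        v = velocity(torch.cat([x, t], dim=1))
        kinetic = kinetic + (v ** 2).sum(dim=1).mean() * dt  # W2 penalty term
        x = x + dt * v
    return x, kinetic

opt_g = torch.optim.Adam(velocity.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

data = torch.randn(64, dim) * 0.1 + 2.0  # toy target distribution
z = torch.randn(64, dim)                 # reference (noise) samples

for _ in range(5):
    # Discriminator step: separate target samples from generated ones.
    fake, _ = push_forward(z)
    d_loss = (bce(disc(data), torch.ones(64, 1))
              + bce(disc(fake.detach()), torch.zeros(64, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: fool the discriminator + kinetic-energy regularization.
    fake, kinetic = push_forward(z)
    g_loss = bce(disc(fake), torch.ones(64, 1)) + 0.1 * kinetic
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Because the flow is simulated only forward, no reverse-time integration is needed during training, matching the simulation-free-reverse property described above.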