Learning a stationary diffusion amounts to estimating the parameters of a stochastic differential equation whose stationary distribution matches a given target distribution. We build on the recently introduced kernel deviation from stationarity (KDS), which enforces stationarity by evaluating expectations of the diffusion's generator over a reproducing kernel Hilbert space. Leveraging the connection between KDS and Stein discrepancies, we introduce the Stein-type KDS (SKDS) as an alternative formulation. We prove that a vanishing SKDS guarantees that the learned diffusion's stationary distribution matches the target. Furthermore, under broad parametrizations, SKDS is convex, and its empirical version is $ε$-quasiconvex with high probability. Empirically, learning with SKDS attains accuracy comparable to KDS at a substantially lower computational cost, and it improves on the majority of competitive baselines.
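For concreteness, here is a minimal sketch of the type of objective involved, with illustrative notation that is not taken from the paper: for a diffusion with generator $\mathcal{L}_\theta$ and target distribution $\pi$, stationarity of $\pi$ is characterized by $\mathbb{E}_{x \sim \pi}[\mathcal{L}_\theta f(x)] = 0$ for all suitable test functions $f$. Restricting $f$ to the unit ball of an RKHS $\mathcal{H}$ yields a KDS-style objective
$$
\mathrm{KDS}(\theta) \;=\; \sup_{f \in \mathcal{H},\, \|f\|_{\mathcal{H}} \le 1} \big|\, \mathbb{E}_{x \sim \pi}\big[\mathcal{L}_\theta f(x)\big] \,\big|,
$$
which vanishes when $\pi$ is stationary for the learned diffusion (the converse requires a sufficiently rich $\mathcal{H}$). A kernel Stein discrepancy has the same form with $\mathcal{L}_\theta$ replaced by a Stein operator of the target, which is the connection the SKDS formulation exploits.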