Classical neural ordinary differential equations (ODEs) are powerful tools for approximating log-density functions along trajectories in high-dimensional spaces, with neural networks parameterizing the velocity fields. This paper proposes a system of neural differential equations, built on deep neural networks, that represents first- and second-order score functions along trajectories. We reformulate the mean field control (MFC) problem with individual noises as an unconstrained optimization problem framed by the proposed neural ODE system. In addition, we introduce a novel regularization term that enforces the characteristics of viscous Hamilton--Jacobi--Bellman (HJB) equations through the evolution of the second-order score function. Examples, including regularized Wasserstein proximal operators (RWPOs), probability flow matching of Fokker--Planck (FP) equations, and linear quadratic (LQ) MFC problems, demonstrate the effectiveness and accuracy of the proposed method.
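To make the idea of evolving score functions along trajectories concrete, the following is a minimal one-dimensional sketch, not the paper's method: along the characteristics dx/dt = v(x) of a deterministic probability flow, differentiating the continuity equation gives closed ODEs for the first- and second-order scores s = d/dx log rho and r = d^2/dx^2 log rho. The linear velocity field v(x) = a*x, the Gaussian initial density, and all variable names are illustrative assumptions.

```python
import math

# Along dx/dt = v(x), the continuity equation yields (material derivatives):
#   D log rho / Dt = -v_x
#   Ds/Dt = -v_xx - v_x * s
#   Dr/Dt = -v_xxx - v_xx * s - 2 * v_x * r
# For the illustrative linear field v(x) = a*x: v_x = a, v_xx = v_xxx = 0,
# so the score ODEs decouple and admit closed-form solutions to check against.
a = 0.5                      # slope of the assumed linear velocity field
x, s, r = 1.0, -1.0, -1.0    # trajectory and scores at t = 0 for rho0 = N(0, 1)
dt, T = 1e-4, 1.0

for _ in range(int(T / dt)):
    # forward Euler step for the trajectory and both score variables
    x += dt * a * x
    s += dt * (-a * s)
    r += dt * (-2 * a * r)

# Closed form: rho(., t) = N(0, e^{2at}), hence s(T) = -x0 e^{-aT}, r(T) = -e^{-2aT}.
print(f"s = {s:.5f}  (exact {-math.exp(-a * T):.5f})")
print(f"r = {r:.5f}  (exact {-math.exp(-2 * a * T):.5f})")
```

In the paper's setting the velocity field is a neural network rather than a known linear map, and the score dynamics become part of the neural ODE system that is trained end to end; this sketch only verifies the transport identities the system is built on.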