Simulating parameter-dependent stochastic differential equations (SDEs) presents significant computational challenges, as separate high-fidelity simulations are typically required for each parameter value of interest. Despite the success of machine learning methods in learning SDE dynamics, existing approaches either require expensive neural network training for score function estimation or lack the ability to handle continuous parameter dependence. We present a training-free conditional diffusion model framework for learning stochastic flow maps of parameter-dependent SDEs, where both drift and diffusion coefficients depend on physical parameters. The key technical innovation is a joint kernel-weighted Monte Carlo estimator that approximates the conditional score function using trajectory data sampled at discrete parameter values, enabling interpolation across both state space and the continuous parameter domain. Once constructed, the resulting generative model produces sample trajectories for any parameter value within the training range without retraining, significantly accelerating parameter studies, uncertainty quantification, and real-time filtering applications. The performance of the proposed approach is demonstrated via three numerical examples of increasing complexity, showing accurate approximation of conditional distributions across varying parameter values.
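To make the abstract's central idea concrete, the following is a minimal sketch of a kernel-weighted Monte Carlo estimator of a conditional score function. It is illustrative only: the function name, the Gaussian parameter kernel, the forward-noising schedule (`alpha_t = 1 - t`, `sigma_t = sqrt(t)`), and the bandwidth `h_theta` are assumptions for the sketch, not details taken from the paper.

```python
import numpy as np

def conditional_score(x_t, t, theta, data_x1, data_theta, h_theta=0.1):
    """Kernel-weighted Monte Carlo estimate of the conditional score
    grad_x log p_t(x_t | theta).

    Hypothetical sketch: assumes a forward process x_t = alpha_t * x_1
    + sigma_t * z with alpha_t = 1 - t, sigma_t = sqrt(t). Under this
    process the perturbed empirical distribution is a Gaussian mixture
    centered at alpha_t * x_1^(i), whose score is a weighted average of
    per-component Gaussian scores.

    x_t:        query state, shape (d,)
    data_x1:    trajectory endpoint samples, shape (N, d)
    data_theta: parameter value of each sample, shape (N,)
    """
    alpha_t, sigma_t = 1.0 - t, np.sqrt(t)
    # Gaussian kernel weights over the parameter axis: samples drawn at
    # parameter values near theta contribute more (this is the
    # interpolation across the continuous parameter domain).
    log_w_theta = -0.5 * ((data_theta - theta) / h_theta) ** 2
    # Log-likelihood of x_t under each sample's forward-noising kernel.
    resid = x_t - alpha_t * data_x1                      # shape (N, d)
    log_like = -0.5 * np.sum(resid**2, axis=-1) / sigma_t**2
    # Joint weights over (state, parameter); log-sum-exp for stability.
    log_w = log_w_theta + log_like
    log_w -= log_w.max()
    w = np.exp(log_w)
    w /= w.sum()
    # Score of the weighted Gaussian mixture.
    return (w[:, None] * (alpha_t * data_x1 - x_t)).sum(axis=0) / sigma_t**2
```

Because the estimator is a closed-form weighted sum over stored trajectory data, no neural network is trained; evaluating it at a new `theta` inside the sampled parameter range only reweights the same data.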