Fast and accurate simulation of dynamical systems is a fundamental challenge across scientific and engineering domains. Traditional numerical integrators often face a trade-off between accuracy and computational efficiency, while existing neural network-based approaches typically require training a separate model for each new system. To overcome these limitations, we introduce a novel multi-modal foundation model for large-scale simulations of differential equations: FMint-SDE (Foundation Model based on Initialization for stochastic differential equations). Based on a decoder-only transformer with in-context learning, FMint-SDE leverages numerical and textual modalities to learn a universal error-correction scheme. It is trained on prompted sequences of coarse solutions generated by conventional solvers, enabling broad generalization across diverse systems. We evaluate our models on a suite of challenging SDE benchmarks spanning applications in molecular dynamics, mechanical systems, finance, and biology. Experimental results show that our approach achieves a superior accuracy-efficiency trade-off compared to classical solvers, underscoring the potential of FMint-SDE as a general-purpose simulation tool for dynamical systems.
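To make the training setup concrete, the coarse solutions mentioned above can be produced by a standard low-resolution solver such as Euler–Maruyama. The sketch below generates one such coarse trajectory for an Ornstein–Uhlenbeck process; it is a minimal illustration only, and the error-correction step a model like FMint-SDE would apply on top of these trajectories is merely indicated in a comment, not implemented here.

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, t0, t1, n_steps, rng):
    """Coarse Euler-Maruyama trajectory for dX = drift(X) dt + diffusion(X) dW."""
    dt = (t1 - t0) / n_steps
    xs = np.empty(n_steps + 1)
    xs[0] = x0
    for i in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))  # Brownian increment ~ N(0, dt)
        xs[i + 1] = xs[i] + drift(xs[i]) * dt + diffusion(xs[i]) * dw
    return xs

# Ornstein-Uhlenbeck process: dX = -theta * X dt + sigma dW
theta, sigma = 1.0, 0.3
rng = np.random.default_rng(0)
coarse = euler_maruyama(lambda x: -theta * x, lambda x: sigma,
                        x0=1.0, t0=0.0, t1=1.0, n_steps=10, rng=rng)

# In the paper's setting, such coarse trajectories (together with a textual
# description of the system) would form the prompt, and the model would
# predict corrections toward a fine-resolution reference solution. The
# correction model itself is not shown; this block only illustrates the
# kind of coarse-solver input being assumed.
```

With only 10 steps the trajectory is cheap but inaccurate, which is exactly the regime where a learned error-correction scheme can recover fine-solver accuracy at coarse-solver cost.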