Diffusion-based models have demonstrated impressive accuracy and generalization in solving partial differential equations (PDEs). However, they still face significant limitations, such as high sampling cost and insufficient physical consistency, stemming from their many-step iterative sampling mechanism and their lack of explicit physics constraints. To address these issues, we propose Phys-Instruct, a novel physics-guided distillation framework that (1) compresses a pre-trained diffusion PDE solver into a few-step generator by matching the generator's distribution to the prior diffusion distribution, enabling rapid sampling, and (2) enhances physical consistency by explicitly injecting PDE knowledge through a PDE distillation guidance term. Phys-Instruct is built on a solid theoretical foundation, which yields a practical physics-constrained training objective with tractable gradients. Across five PDE benchmarks, Phys-Instruct achieves orders-of-magnitude faster inference while reducing PDE error by more than 8 times compared to state-of-the-art diffusion baselines. Moreover, the resulting unconditional student model serves as a compact prior, enabling efficient and physically consistent inference on a variety of downstream conditional tasks. Our results indicate that Phys-Instruct is a novel, effective, and efficient framework for ultra-fast PDE solving powered by deep generative models.
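The abstract describes a training objective that combines distribution-matching distillation with an explicit PDE-residual guidance term. The following is a minimal illustrative sketch of such a combined objective, assuming a simple mean-squared distillation proxy and a 1D heat equation as the governing PDE; all function names, the equation choice, and the weighting parameter `lambda_pde` are hypothetical and not the paper's actual formulation or API.

```python
import numpy as np

# Hypothetical sketch: total loss = distillation term + weighted PDE residual term.
# This is NOT the paper's actual objective; it only illustrates the structure
# described in the abstract (distribution matching + explicit physics guidance).

def distillation_loss(student_out, teacher_out):
    """Proxy for matching the student generator to the teacher diffusion prior
    (here a simple mean-squared discrepancy; the real objective matches distributions)."""
    return np.mean((student_out - teacher_out) ** 2)

def pde_residual_loss(u, dx=0.1, dt=0.01, nu=0.01):
    """Residual of a 1D heat equation u_t = nu * u_xx via finite differences.
    u has shape (time, space); interior points only."""
    u_t = (u[1:, 1:-1] - u[:-1, 1:-1]) / dt           # forward difference in time
    u_xx = (u[:-1, 2:] - 2 * u[:-1, 1:-1] + u[:-1, :-2]) / dx ** 2  # central in space
    return np.mean((u_t - nu * u_xx) ** 2)

def total_loss(student_out, teacher_out, lambda_pde=1.0):
    """Combined physics-constrained distillation objective (illustrative)."""
    return distillation_loss(student_out, teacher_out) + lambda_pde * pde_residual_loss(student_out)
```

Since both terms are differentiable in the generator output, gradients of this combined objective are tractable with any autodiff framework, mirroring the abstract's claim at a toy scale.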