We propose Derivative Learning (DERL), a supervised approach that models physical systems by learning their partial derivatives. We also leverage DERL to build physical models incrementally, designing a distillation protocol that effectively transfers knowledge from a pre-trained model to a student model. We provide theoretical guarantees that DERL can learn the true physical system and remain consistent with the underlying physical laws, even when trained on empirical derivatives. DERL outperforms state-of-the-art methods in generalizing an ODE to unseen initial conditions and a parametric PDE to unseen parameters. We also design a DERL-based method for transferring physical knowledge across models by extending them to new portions of the physical domain and new ranges of PDE parameters. This introduces a new pipeline for building physical models incrementally in multiple stages.
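As a minimal sketch of the derivative-supervision idea described above (not the paper's actual architecture or training code): given samples of an unknown target function, one can estimate empirical derivatives by finite differences and fit a model so that its *derivative*, rather than its output, matches them. The target `f(x) = x**2` and the quadratic model below are illustrative assumptions.

```python
import numpy as np

# Data: samples of an unknown target f(x) = x**2 on a grid.
xs = np.linspace(0.0, 1.0, 101)
ys = xs**2

# Empirical derivatives via finite differences (np.gradient uses
# central differences in the interior) -- the "empirical derivatives"
# the theoretical guarantees refer to.
dy_emp = np.gradient(ys, xs)

# Hypothetical model: u(x) = a*x^2 + b*x + c, so du/dx = 2*a*x + b.
# Fit (a, b) by least squares so the model derivative matches dy_emp.
A = np.stack([2 * xs, np.ones_like(xs)], axis=1)
(a, b), *_ = np.linalg.lstsq(A, dy_emp, rcond=None)

print(a, b)  # expect a close to 1 and b close to 0, since f'(x) = 2x
```

The constant `c` is not identifiable from derivative supervision alone, which is why practical derivative-learning setups typically anchor the solution with boundary or initial conditions.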