We propose evolutionary Multi-objective Optimization for Replica-Exchange-based Physics-informed operator-learning Networks (Morephy-Net) to solve parametric partial differential equations (PDEs) in noisy-data regimes, covering both forward prediction and inverse identification. Existing physics-informed neural networks and operator-learning models (e.g., DeepONets and Fourier neural operators) often face three coupled challenges: (i) balancing data/operator and physics residual losses, (ii) maintaining robustness under noisy or sparse observations, and (iii) providing reliable uncertainty quantification. Morephy-Net addresses these issues by integrating: (i) evolutionary multi-objective optimization that treats the data/operator and physics residual terms as separate objectives and searches the Pareto front, thereby avoiding ad hoc loss weighting; (ii) replica-exchange stochastic gradient Langevin dynamics to enhance global exploration and stabilize training in non-convex landscapes; and (iii) Bayesian uncertainty quantification obtained from stochastic sampling. We validate Morephy-Net on representative forward and inverse problems, including the one-dimensional Burgers equation and the time-fractional mixed diffusion--wave equation. The results demonstrate consistent improvements in accuracy, noise robustness, and calibrated uncertainty estimates over standard operator-learning baselines.
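To make ingredients (ii) and (iii) concrete, the sketch below illustrates replica-exchange stochastic gradient Langevin dynamics on a toy double-well potential. This is a minimal illustration under stated assumptions, not the paper's implementation: the potential `U`, the two-chain setup, the step sizes, and the temperatures are all placeholders for the actual PINN/operator loss and training configuration. Samples from the low-temperature chain play the role of the stochastic samples used for Bayesian uncertainty quantification.

```python
import numpy as np

def U(theta):
    # Toy non-convex potential standing in for the training loss
    # (assumption: the real target is the physics-informed loss, not this).
    return float(np.sum((theta**2 - 1.0) ** 2))

def grad_U(theta):
    # Analytic gradient of the double-well potential above.
    return 4.0 * theta * (theta**2 - 1.0)

def resgld(n_steps=2000, eta=1e-3, tau_low=0.01, tau_high=1.0,
           swap_every=50, seed=0):
    rng = np.random.default_rng(seed)
    # chain 0: low temperature (exploitation); chain 1: high temperature (exploration)
    theta = [rng.normal(size=2), rng.normal(size=2)]
    taus = [tau_low, tau_high]
    samples = []
    for t in range(n_steps):
        for i in range(2):
            # SGLD step: gradient descent plus temperature-scaled Gaussian noise
            xi = rng.normal(size=theta[i].shape)
            theta[i] = (theta[i] - eta * grad_U(theta[i])
                        + np.sqrt(2.0 * eta * taus[i]) * xi)
        if (t + 1) % swap_every == 0:
            # Metropolis-style exchange between the two temperatures
            log_alpha = (1.0 / taus[0] - 1.0 / taus[1]) * (U(theta[0]) - U(theta[1]))
            if np.log(rng.uniform()) < log_alpha:
                theta[0], theta[1] = theta[1], theta[0]
        samples.append(theta[0].copy())
    return np.array(samples)

# Crude uncertainty estimate from the low-temperature chain:
# mean and spread of the late samples approximate a posterior summary.
samples = resgld()
mean, std = samples[-500:].mean(axis=0), samples[-500:].std(axis=0)
```

The swap acceptance rule favors exchanging states when the low-temperature chain is stuck at a higher-energy configuration than the high-temperature one, which is what gives the method its global-exploration benefit in non-convex landscapes.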
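Ingredient (i), treating the data/operator and physics residual terms as separate objectives and maintaining a Pareto front rather than weighting the losses, can be sketched with a minimal evolutionary loop. The two scalar objectives below are hypothetical stand-ins (distance-to-0 versus distance-to-1), not Morephy-Net's actual losses, and the mutation/selection scheme is a deliberately simple placeholder for the paper's evolutionary algorithm.

```python
import numpy as np

def objectives(x):
    # Stand-ins for the two competing losses (assumption):
    # f1 ~ data/operator misfit, f2 ~ PDE residual.
    return float(x**2), float((x - 1.0) ** 2)

def dominates(a, b):
    # a Pareto-dominates b: no worse in both objectives, strictly better in one.
    return a[0] <= b[0] and a[1] <= b[1] and (a[0] < b[0] or a[1] < b[1])

def pareto_front(pop):
    objs = [objectives(x) for x in pop]
    return [x for x, f in zip(pop, objs)
            if not any(dominates(g, f) for g in objs)]

def evolve(pop_size=20, generations=50, sigma=0.1, seed=0):
    rng = np.random.default_rng(seed)
    pop = list(rng.normal(0.5, 1.0, size=pop_size))
    for _ in range(generations):
        # mutate, merge parents and children, keep the non-dominated set
        children = [x + sigma * rng.normal() for x in pop]
        front = pareto_front(pop + children)
        while len(front) < pop_size:
            # refill with mutated front members to keep the population size
            front.append(front[rng.integers(len(front))] + sigma * rng.normal())
        pop = front[:pop_size]
    return pareto_front(pop)
```

Because selection keeps only mutually non-dominated candidates, the population drifts toward the trade-off curve between the two objectives instead of collapsing to a single weighted compromise, which is the behavior the abstract contrasts with ad hoc loss weighting.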