In this paper, we study the finite element operator network (FEONet), an operator-learning method for parametric problems, originally introduced in J. Y. Lee, S. Ko, and Y. Hong, Finite Element Operator Network for Solving Elliptic-Type Parametric PDEs, SIAM J. Sci. Comput., 47(2), C501-C528, 2025. FEONet realizes the parameter-to-solution map on a finite element space and admits a training procedure that requires no training data, while exhibiting high accuracy and robustness across a broad class of problems. However, its computational cost grows and its accuracy may deteriorate as the number of elements increases, which poses notable challenges for large-scale problems. To address this issue, we propose a new sparse network architecture motivated by the structure of the finite elements. Through extensive numerical experiments, we show that the proposed sparse network achieves substantial reductions in computational cost while maintaining comparable accuracy. We also establish theoretical results demonstrating that the sparse architecture can effectively approximate the target operator, and we provide a stability analysis ensuring reliable training and prediction.