This paper establishes convergence rates for learning elliptic pseudo-differential operators, a fundamental operator class in partial differential equations and mathematical physics. In a wavelet-Galerkin framework, we formulate learning over this class as a structured infinite-dimensional regression problem with multiscale sparsity. Building on this structure, we propose a sparse, data- and computation-efficient estimator, which leverages a novel matrix compression scheme tailored to the learning task and a nested-support strategy to balance approximation and estimation errors. In addition to obtaining convergence rates for the estimator, we show that the learned operator induces an efficient and stable Galerkin solver whose numerical error matches its statistical accuracy. Our results therefore contribute to bringing together operator learning, data-driven solvers, and wavelet methods in scientific computing.