Training implicit neural representations (INRs) to capture fine-scale details typically relies on iterative backpropagation and is often hindered by spectral bias when the target exhibits highly non-uniform frequency content. We propose ELM-INR, a backpropagation-free INR that decomposes the domain into overlapping subdomains and fits each local problem using an Extreme Learning Machine (ELM) in closed form, replacing iterative optimization with stable linear least-squares solutions. This design yields fast and numerically robust reconstruction by combining local predictors through a partition of unity. To understand where approximation becomes difficult under fixed local capacity, we analyze the method from a spectral Barron norm perspective, which reveals that global reconstruction error is dominated by regions with high spectral complexity. Building on this insight, we introduce BEAM, an adaptive mesh refinement strategy that balances spectral complexity across subdomains to improve reconstruction quality in capacity-constrained regimes.
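The core pipeline of the abstract (fit each overlapping subdomain with a random-feature ELM in closed form, then blend the local predictors with a partition of unity) can be sketched in a few lines. This is a minimal 1D illustration, not the paper's implementation: the sinusoidal random features, the two fixed subdomains, the linear blending ramp, and all hyperparameters (`width`, `ridge`, the frequency scale) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(x, y, width=64, ridge=1e-6):
    """Fit one local ELM in closed form: fixed random hidden layer,
    ridge-regularized linear least squares for the readout weights."""
    W = rng.normal(scale=40.0, size=(width,))    # fixed random frequencies (assumed scale)
    b = rng.uniform(0, 2 * np.pi, size=width)    # fixed random phases
    H = np.sin(np.outer(x, W) + b)               # hidden activations, shape (N, width)
    # Normal equations (H^T H + ridge * I) beta = H^T y -- no backprop involved.
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(width), H.T @ y)
    return W, b, beta

def elm_predict(x, params):
    W, b, beta = params
    return np.sin(np.outer(x, W) + b) @ beta

# Target with non-uniform frequency content on [0, 1].
x = np.linspace(0.0, 1.0, 400)
y = np.sin(2 * np.pi * 3 * x) + 0.5 * np.sin(2 * np.pi * 17 * x)

# Two overlapping subdomains; each local problem is solved independently.
left, right = x <= 0.6, x >= 0.4
params_left = elm_fit(x[left], y[left])
params_right = elm_fit(x[right], y[right])

# Partition of unity: smooth-ish weights that sum to 1 everywhere,
# so each prediction only counts inside (or near) its own subdomain.
w_left = np.clip((0.6 - x) / 0.2, 0.0, 1.0)
w_right = 1.0 - w_left
y_hat = w_left * elm_predict(x, params_left) + w_right * elm_predict(x, params_right)

rel_err = np.linalg.norm(y_hat - y) / np.linalg.norm(y)
```

In this sketch the only trainable quantities are the readout weights `beta`, obtained from a single linear solve per subdomain; the paper's BEAM refinement step would additionally split subdomains whose estimated spectral complexity is disproportionately high.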