Implicit Neural Representations (INRs) that learn a Signed Distance Function (SDF) are a powerful tool for continuous 3D scene reconstruction. These models are trained by enforcing the Eikonal equation. We demonstrate theoretically that, despite the ill-posedness of the Eikonal equation, generalization error estimates can be obtained for Neural SDFs in terms of the training error. However, training with the Eikonal loss can lead to unstable gradient flows, necessitating alternative stabilization techniques. Traditional numerical solvers for the equation have relied on viscosity approaches for regularization. We enhance Neural SDF training using this well-developed theory and introduce a new loss formulation we call ViscoReg. We theoretically demonstrate the stability of the gradient flow equation of our proposed loss term. Empirically, ViscoReg outperforms state-of-the-art approaches such as SIREN, DiGS, and StEik without adding significant computational cost.
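To make the two loss terms mentioned above concrete: the Eikonal equation requires $|\nabla u| = 1$, and viscosity methods regularize it by adding a small Laplacian term, $|\nabla u| = 1 + \epsilon \Delta u$. The sketch below is illustrative only, not the paper's implementation: it evaluates the plain Eikonal residual and a viscosity-regularized residual for the 1-D distance function $u(x) = |x|$ via finite differences; the grid, the value of `eps`, and the specific residual form are all assumptions for demonstration.

```python
import numpy as np

# 1-D grid and a candidate distance function u(x) = |x|
# (the exact SDF of the point {0}, with a kink at the origin)
x = np.linspace(-1.0, 1.0, 401)
h = x[1] - x[0]
u = np.abs(x)

# Central-difference gradient and Laplacian on interior points
grad_u = (u[2:] - u[:-2]) / (2 * h)
lap_u = (u[2:] - 2 * u[1:-1] + u[:-2]) / h**2

# Standard Eikonal residual: mean of | |grad u| - 1 |
eikonal_loss = np.mean(np.abs(np.abs(grad_u) - 1.0))

# Viscosity-regularized residual (illustrative form with a
# hypothetical coefficient eps): | |grad u| - 1 - eps * lap u |
eps = 0.01
visco_loss = np.mean(np.abs(np.abs(grad_u) - 1.0 - eps * lap_u))
```

Away from the kink both residuals vanish, since $|u'(x)| = 1$ there; at the kink the Laplacian term spikes, so the viscosity-regularized residual penalizes the non-smooth point more strongly, which is the regularizing effect that viscosity methods exploit.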