3D Gaussian splatting, a novel differentiable rendering technique, has achieved state-of-the-art novel view synthesis results with high rendering speeds and relatively low training times. However, its performance on scenes commonly found in indoor datasets is poor, owing to the lack of geometric constraints during optimization. We extend 3D Gaussian splatting with depth and normal cues to tackle challenging indoor datasets, and showcase techniques for efficient mesh extraction, an important downstream application. Specifically, we regularize the optimization procedure with depth information, enforce local smoothness of nearby Gaussians, and supervise the geometry of the 3D Gaussians with normal cues to achieve better alignment with the true scene geometry. We improve depth estimation and novel view synthesis results over baselines, and show how this simple yet effective regularization technique can be used to directly extract meshes from the Gaussian representation, yielding more physically accurate reconstructions of indoor scenes. Our code will be released at https://github.com/maturk/dn-splatter.
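The regularization described above can be illustrated with a minimal sketch of a combined training loss. This is not the authors' implementation: the function name, tensor shapes, and loss weights (`lambda_d`, `lambda_n`) are assumptions chosen for clarity, combining a photometric L1 term with depth supervision and a cosine-based normal-alignment term.

```python
import torch
import torch.nn.functional as F

def dn_regularized_loss(rgb, rgb_gt, depth, depth_gt, normals, normals_gt,
                        lambda_d=0.2, lambda_n=0.1):
    """Hypothetical combined loss for depth- and normal-regularized splatting.

    rgb, rgb_gt:       (..., 3) rendered and ground-truth colors
    depth, depth_gt:   (..., 1) rendered and sensor/monocular depth
    normals, normals_gt: (..., 3) unit normals from Gaussians and from cues
    Weights lambda_d and lambda_n are illustrative, not the paper's values.
    """
    l_rgb = torch.abs(rgb - rgb_gt).mean()          # photometric L1 term
    l_depth = torch.abs(depth - depth_gt).mean()    # depth regularization
    # Normal supervision: penalize misalignment as 1 - cosine similarity.
    l_normal = (1.0 - F.cosine_similarity(normals, normals_gt, dim=-1)).mean()
    return l_rgb + lambda_d * l_depth + lambda_n * l_normal
```

When predictions match the supervision exactly, every term vanishes, so the loss provides a clean gradient signal toward geometry that agrees with both the depth and normal cues.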