The reconstruction of indoor scenes remains challenging due to the inherent complexity of spatial structures and the prevalence of textureless regions. Recent advances in 3D Gaussian Splatting have improved novel view synthesis with accelerated processing but have yet to deliver comparable performance in surface reconstruction. In this paper, we introduce 2DGS-Room, a novel method leveraging 2D Gaussian Splatting for high-fidelity indoor scene reconstruction. Specifically, we employ a seed-guided mechanism to control the distribution of 2D Gaussians, with the density of seed points dynamically optimized through adaptive growth and pruning mechanisms. To further improve geometric accuracy, we incorporate monocular depth and normal priors, which provide constraints for fine details and textureless regions, respectively. Additionally, multi-view consistency constraints are employed to mitigate artifacts and further enhance reconstruction quality. Extensive experiments on the ScanNet and ScanNet++ datasets demonstrate that our method achieves state-of-the-art performance in indoor scene reconstruction.
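The adaptive growth and pruning of seed points mentioned above can be illustrated with a minimal sketch. This is not the paper's implementation; the function name `update_seeds`, the thresholds, and the use of accumulated gradient magnitude and per-seed rendering contribution as the growth/pruning signals are all illustrative assumptions.

```python
import numpy as np

def update_seeds(seeds, grad_norm, contribution,
                 grow_thresh=0.02, prune_thresh=0.01, jitter=0.01):
    """One hypothetical densification step over 3D seed points.

    seeds:        (N, 3) seed positions
    grad_norm:    (N,) accumulated positional gradient magnitude per seed
    contribution: (N,) per-seed contribution to the rendered views
    All thresholds are illustrative, not taken from the paper.
    """
    rng = np.random.default_rng(0)
    # Grow: duplicate seeds whose gradient signal is large (under-fit regions),
    # jittering the copies so the new Gaussians can spread to cover geometry.
    grow_mask = grad_norm > grow_thresh
    new_seeds = seeds[grow_mask] + rng.normal(scale=jitter,
                                              size=(int(grow_mask.sum()), 3))
    # Prune: drop seeds that contribute too little to the rendered images.
    keep_mask = contribution > prune_thresh
    return np.concatenate([seeds[keep_mask], new_seeds], axis=0)
```

In this sketch, growth and pruning run in one pass, so the seed count adapts per optimization step rather than being fixed in advance.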