Outdoor scene reconstruction remains challenging due to the stark contrast between well-textured, nearby regions and distant backgrounds dominated by low detail, uneven illumination, and sky effects. We introduce a two-stage Gaussian Splatting framework that explicitly separates and optimizes these regions, yielding higher-fidelity novel view synthesis. In stage one, background primitives are initialized within a spherical shell and optimized with a loss that combines a background-only photometric term with two geometric regularizers: one constraining the Gaussians to remain inside the shell, and another aligning them with local tangential planes. In stage two, foreground Gaussians, initialized from a Structure-from-Motion reconstruction, are added and refined using the standard rendering loss, while the background set remains fixed but still contributes to the final image formation. Experiments on diverse outdoor datasets show that our method reduces background artifacts and improves perceptual quality compared to state-of-the-art baselines. Moreover, the explicit background separation enables automatic, object-free environment map estimation, opening new possibilities for photorealistic outdoor rendering and mixed-reality applications.
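The stage-one objective described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the hinge-style shell penalty, the use of each Gaussian's shortest principal axis as its "normal", and the weights `lam_shell` and `lam_tan` are all assumptions made for the example.

```python
import numpy as np

def shell_loss(centers, r_in, r_out):
    """Hypothetical shell regularizer: penalize Gaussian centers that
    drift outside the spherical shell [r_in, r_out] (assumed hinge form)."""
    r = np.linalg.norm(centers, axis=1)
    below = np.maximum(r_in - r, 0.0)   # distance by which a center falls inside r_in
    above = np.maximum(r - r_out, 0.0)  # distance by which a center exceeds r_out
    return np.mean(below**2 + above**2)

def tangent_loss(centers, normals):
    """Hypothetical tangent-plane regularizer: encourage each Gaussian's
    normal (e.g. its shortest principal axis) to align with the radial
    direction, so the Gaussian lies flat on the local tangential plane."""
    radial = centers / np.linalg.norm(centers, axis=1, keepdims=True)
    n = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    cos = np.abs(np.sum(radial * n, axis=1))  # |cos| handles normal sign ambiguity
    return np.mean(1.0 - cos)

def stage1_loss(photo, centers, normals, r_in, r_out,
                lam_shell=1.0, lam_tan=0.1):
    """Background-only photometric term plus the two geometric regularizers
    (weights are illustrative, not from the paper)."""
    return (photo
            + lam_shell * shell_loss(centers, r_in, r_out)
            + lam_tan * tangent_loss(centers, normals))
```

Under this sketch, background Gaussians sitting inside the shell with radially aligned normals incur zero geometric penalty, so the regularizers only act on primitives that violate the shell geometry.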