We introduce OceanSplat, a novel 3D Gaussian Splatting-based approach for accurately representing 3D geometry in underwater scenes. To overcome multi-view inconsistencies caused by underwater optical degradation, our method enforces trinocular view consistency by rendering horizontally and vertically translated camera views relative to each input view and aligning them via inverse warping. Furthermore, these translated camera views are used to derive a synthetic epipolar depth prior through triangulation, which serves as a self-supervised depth regularizer. These geometric constraints facilitate the spatial optimization of 3D Gaussians and preserve scene structure in underwater environments. We also propose a depth-aware alpha adjustment that modulates the opacity of 3D Gaussians during early training based on their $z$-component and viewing direction, deterring the formation of medium-induced primitives. With our contributions, 3D Gaussians are disentangled from the scattering medium, enabling robust representation of object geometry and significantly reducing floating artifacts in reconstructed underwater scenes. Experiments on real-world underwater and simulated scenes demonstrate that OceanSplat substantially outperforms existing methods for both scene reconstruction and restoration in scattering media.
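To make the abstract's geometric constraints concrete, a minimal illustrative formulation follows; the notation ($f$, $b_h$, $d$, $\lambda$) and the specific functional forms are our own assumptions for exposition, not necessarily the paper's exact equations. Given an input view and its horizontally translated render with baseline $b_h$, focal length $f$, and per-pixel disparity $d(\mathbf{p})$, standard two-view triangulation yields a synthetic depth prior
$$\hat{Z}(\mathbf{p}) \;=\; \frac{f\, b_h}{d(\mathbf{p})},$$
which can be compared against the splatted depth as a self-supervised regularizer, e.g. $\mathcal{L}_{\text{depth}} = \lVert \hat{Z} - Z_{\text{render}} \rVert_1$. Likewise, one plausible instance of a depth-aware alpha adjustment is to attenuate the opacity of Gaussians lying close to the camera along the viewing ray during early iterations,
$$\alpha' \;=\; \alpha \,\bigl(1 - e^{-\lambda\, z_{\text{cam}}}\bigr),$$
where $z_{\text{cam}}$ is the Gaussian center's depth in the camera frame and $\lambda$ is a hypothetical decay rate; such a factor discourages near-camera primitives that would otherwise absorb the scattering medium instead of scene geometry.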