Underwater images are altered by the physical characteristics of the medium through which light rays pass before reaching the optical sensor. Scattering and wavelength-dependent absorption significantly modify the captured colors depending on the distance of observed elements to the image plane. In this paper, we aim to recover an image of the scene as if the water had no effect on light propagation. We introduce SUCRe, a novel method that exploits the scene's 3D structure for underwater color restoration. By following points in multiple images and tracking their intensities at different distances to the sensor, we constrain the optimization of the parameters in an underwater image formation model and retrieve unattenuated pixel intensities. We conduct extensive quantitative and qualitative analyses of our approach in a variety of scenarios ranging from natural light to deep-sea environments using three underwater datasets acquired from real-world scenarios and one synthetic dataset. We also compare the performance of the proposed approach with that of a wide range of existing state-of-the-art methods. The results demonstrate a consistent benefit of exploiting multiple views across a spectrum of objective metrics. Our code is publicly available at https://github.com/clementinboittiaux/sucre.
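The optimization described above, fitting an underwater image formation model to intensities of the same scene point observed at several distances, can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: it assumes a simplified single-channel Jaffe-McGlamery-style model `I(z) = J·e^(-βz) + B·(1 - e^(-βz))`, where `J` is the unattenuated intensity to recover, `β` a combined attenuation coefficient, and `B` the backscatter; the variable names and synthetic data are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def image_formation(z, J, beta, B):
    # Simplified per-point underwater model: the direct signal J decays
    # with distance z while backscatter B saturates with distance.
    a = np.exp(-beta * z)
    return J * a + B * (1.0 - a)

# Synthetic intensities of one scene point tracked across multiple views,
# standing in for the multi-view observations described in the abstract.
rng = np.random.default_rng(0)
z_obs = np.linspace(1.0, 10.0, 20)          # distances to the sensor (m)
J_true, beta_true, B_true = 0.8, 0.3, 0.2   # hypothetical ground truth
I_obs = image_formation(z_obs, J_true, beta_true, B_true)
I_obs += rng.normal(scale=0.005, size=I_obs.shape)  # sensor noise

# Constrain the model parameters with the multi-view intensities;
# J_hat is the restored, unattenuated pixel intensity.
(J_hat, beta_hat, B_hat), _ = curve_fit(
    image_formation, z_obs, I_obs, p0=(0.5, 0.1, 0.1)
)
print(J_hat, beta_hat, B_hat)
```

Observing the point over a range of distances is what makes the three parameters separable: at small `z` the measurement is dominated by `J`, at large `z` by `B`, and the transition constrains `β`.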