3D scene stylization approaches based on Neural Radiance Fields (NeRF) achieve promising results by optimizing with a Nearest Neighbor Feature Matching (NNFM) loss. However, the NNFM loss does not consider global style information. In addition, the implicit representation of NeRF limits fine-grained control over the resulting scenes. In this paper, we introduce ABC-GS, a novel framework based on 3D Gaussian Splatting that achieves high-quality 3D style transfer. To this end, a controllable matching stage is designed to achieve precise alignment between scene content and style features through segmentation masks. Moreover, a style transfer loss based on feature alignment is proposed to ensure that the stylization results accurately reflect the global style of the reference image. Furthermore, the original geometric information of the scene is preserved with a depth loss and Gaussian regularization terms. Extensive experiments show that ABC-GS provides controllable style transfer and achieves stylization results that are more faithfully aligned with the global style of the chosen artistic reference. Our homepage is available at https://vpx-ecnu.github.io/ABC-GS-website.
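For context, the NNFM loss that the abstract contrasts against matches each content feature to its nearest style feature in cosine distance and minimizes those distances. Below is a minimal NumPy sketch of this matching rule; the function name, feature shapes, and `eps` stabilizer are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def nnfm_loss(content_feats, style_feats, eps=1e-8):
    """Nearest Neighbor Feature Matching (NNFM) loss (illustrative sketch).

    content_feats: (N, D) features extracted from rendered views
    style_feats:   (M, D) features extracted from the style reference image
    Returns the mean cosine distance from each content feature to its
    nearest-neighbor style feature; note this uses only local matches and
    carries no global style statistics, which is the limitation ABC-GS targets.
    """
    # L2-normalize rows so the dot product equals cosine similarity
    c = content_feats / (np.linalg.norm(content_feats, axis=1, keepdims=True) + eps)
    s = style_feats / (np.linalg.norm(style_feats, axis=1, keepdims=True) + eps)
    cos_dist = 1.0 - c @ s.T            # (N, M) pairwise cosine distances
    return cos_dist.min(axis=1).mean()  # nearest style neighbor per content feature
```

In practice the features would come from a pretrained network (e.g. VGG activations) and the loss would be backpropagated to the scene representation; here plain arrays suffice to show the matching step.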