Recent advances in neural rendering, particularly 2D Gaussian Splatting (2DGS), have shown promising results for jointly reconstructing fine appearance and geometry by leveraging 2D Gaussian surfels. However, current methods face significant challenges when rendering at arbitrary viewpoints, such as anti-aliasing in down-sampled rendering and texture-detail preservation in high-resolution rendering. We propose a novel method that aligns 2D surfels with texture maps and augments them with per-ray depth sorting and Fisher-based pruning for rendering consistency and efficiency. With correct ordering, per-surfel texture maps significantly improve the ability to capture fine details. In addition, to render high-fidelity detail across varying viewpoints, we design a frustum-based sampling method that mitigates aliasing artifacts. Experimental results on standard benchmarks and our custom texture-rich dataset demonstrate that our method surpasses existing techniques, particularly in detail preservation and anti-aliasing.
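The per-ray depth sorting mentioned above can be illustrated with a minimal compositing sketch. This is not the paper's implementation: it assumes hypothetical per-sample arrays of depth, color, and opacity for one ray, and shows why sorting by depth along each ray (rather than using a single global primitive order) matters for front-to-back alpha compositing.

```python
import numpy as np

def composite_ray(depths, colors, alphas):
    """Front-to-back alpha compositing of surfel samples along one ray.

    depths: (N,) per-sample depth along this ray (hypothetical inputs)
    colors: (N, 3) per-sample RGB contributions
    alphas: (N,) per-sample opacities in [0, 1]
    """
    order = np.argsort(depths)      # sort near-to-far for THIS ray
    out = np.zeros(3)
    transmittance = 1.0             # fraction of light still unoccluded
    for i in order:
        out += transmittance * alphas[i] * np.asarray(colors[i])
        transmittance *= 1.0 - alphas[i]
    return out
```

Because the sort happens per ray, a surfel pair can legitimately swap order between neighboring rays, which a single global sort cannot express; an opaque near sample correctly occludes everything behind it regardless of the order in which samples arrive.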