In this paper, we propose the first method for transferring material transformations across different scenes. Building on disentangled Neural Radiance Field (NeRF) representations, our approach learns to map Bidirectional Reflectance Distribution Functions (BRDFs) from pairs of scenes observed under varying conditions, such as dry and wet. The learned transformations can then be applied to unseen scenes with similar materials, effectively rendering the learned transformation at an arbitrary level of intensity. Extensive experiments on synthetic scenes and real-world objects validate the effectiveness of our approach, showing that it can learn various transformations such as wetness, painting, and coating. Our results highlight not only the versatility of our method but also its potential for practical applications in computer graphics. We publish our implementation, along with our synthetic and real datasets, at https://github.com/astra-vision/BRDFTransform