Interfacial dynamics in two-phase flows govern momentum, heat, and mass transfer, yet remain difficult to measure experimentally. Classical techniques face intrinsic limitations near moving interfaces, while existing neural rendering methods target single-phase flows with diffuse boundaries and cannot handle sharp, deformable liquid-vapor interfaces. We propose SurfPhase, a novel model for reconstructing 3D interfacial dynamics from sparse camera views. Our approach integrates dynamic Gaussian surfels with a signed distance function formulation for geometric consistency, and leverages a video diffusion model to synthesize novel-view videos that refine the reconstruction from sparse observations. We evaluate on a new dataset of high-speed pool boiling videos, demonstrating high-quality view synthesis and velocity estimation from only two camera views. Project website: https://yuegao.me/SurfPhase.