Modeling deformable objects, especially continuum materials, in a way that is physically plausible, generalizable, and data-efficient remains challenging across 3D vision, graphics, and robotic manipulation. Many existing methods oversimplify the rich dynamics of deformable objects or require large training sets, which limits their generalization. We introduce Embodied MPM (EMPM), a deformable-object modeling and simulation framework built on a differentiable Material Point Method (MPM) simulator that captures the dynamics of challenging materials. From multi-view RGB-D videos, our approach reconstructs object geometry and appearance, then uses an MPM physics engine to simulate object behavior by minimizing the mismatch between predicted and observed visual data. We further optimize MPM parameters online from sensory feedback, yielding adaptive, robust, and physics-aware object representations that open new possibilities for robotic manipulation of complex deformables. Experiments show that EMPM outperforms spring-mass baseline models. Project website: https://embodied-mpm.github.io.
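The core loop described above — simulate, compare against observations, update physical parameters — can be sketched in miniature. The toy below is NOT the paper's MPM engine: it stands in a 1D mass-spring rollout for the simulator, a trajectory-mismatch loss for the visual loss, and finite differences for the gradients a differentiable simulator would provide by backpropagation; all names and constants (`simulate`, stiffness `k`, learning rate) are illustrative assumptions.

```python
import numpy as np

def simulate(k, x0=1.0, v0=0.0, dt=0.01, steps=100):
    """Symplectic-Euler rollout of a unit mass on a spring with stiffness k.

    Toy stand-in for an MPM forward simulation: roll physics forward,
    record the state trajectory that would be compared against observations.
    """
    xs = np.empty(steps)
    x, v = x0, v0
    for t in range(steps):
        v += -k * x * dt  # F = -k x, unit mass
        x += v * dt
        xs[t] = x
    return xs

# "Observed" trajectory, generated with an unknown ground-truth stiffness
# (in EMPM this role is played by multi-view RGB-D observations).
k_true = 4.0
observed = simulate(k_true)

def loss(kk):
    """Mismatch between predicted and observed trajectories."""
    return np.mean((simulate(kk) - observed) ** 2)

# Online parameter update: gradient descent on the mismatch. A differentiable
# simulator gives this gradient analytically; here we use central differences.
k, lr, eps = 1.0, 1.0, 1e-4
for _ in range(1000):
    grad = (loss(k + eps) - loss(k - eps)) / (2 * eps)
    k -= lr * grad

print(round(k, 2))  # recovers a stiffness close to k_true
```

The same structure scales up in the real system: the forward rollout becomes a differentiable MPM step, the scalar stiffness becomes a vector of material parameters, and the trajectory loss becomes a rendering-based comparison with the observed videos.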