We propose a new method for cloth digitalization. Unlike existing methods, which learn from data captured under relatively casual settings, we propose to learn from data captured under strictly tested measuring protocols and to recover plausible physical parameters of cloths. However, such data is currently absent, so we first propose a new dataset with accurate cloth measurements. Further, due to the nature of the capture process, the dataset is considerably smaller than those typically used in deep learning. To learn from small data, we propose a new Bayesian differentiable cloth model that estimates the complex material heterogeneity of real cloths and provides highly accurate digitalization from very limited data samples. Through exhaustive evaluation and comparison, we show that our method is accurate in cloth digitalization, efficient in learning from limited data samples, and general in capturing material variations. Code and data are available at https://github.com/realcrane/Bayesian-Differentiable-Physics-for-Cloth-Digitalization