Tensor CANDECOMP/PARAFAC decomposition (CPD) is a fundamental model for tensor reconstruction. Although the Bayesian framework allows for principled uncertainty quantification and automatic hyperparameter learning, existing methods do not scale well to large tensors because they require high-dimensional matrix inversions. To this end, we introduce CP-GAMP, a scalable Bayesian CPD algorithm. The algorithm leverages generalized approximate message passing (GAMP) to avoid matrix inversions and incorporates an expectation-maximization routine to jointly infer the tensor rank and noise power. Experiments on synthetic 100×100×100 rank-20 tensors with only 20% of entries observed show that the proposed algorithm reduces runtime by 82.7% compared with the state-of-the-art variational Bayesian CPD method, while maintaining comparable reconstruction accuracy.