The Hadamard product of tensor train (TT) tensors is a fundamental nonlinear operation in scientific computing and data analysis. However, because it tends to significantly increase TT ranks, the Hadamard product poses a major computational challenge in TT tensor-based algorithms. To address this, it is crucial to develop recompression algorithms that mitigate this rank growth. Existing recompression algorithms require an explicit representation of the Hadamard product, resulting in high computational and storage costs. In this work, we propose a Hadamard-avoiding TT recompression (HaTT) algorithm, which reduces both computational complexity and storage requirements. By leveraging the structure of the Hadamard product of TT tensors and exploiting its Hadamard-product-free property, the HaTT algorithm achieves significantly lower complexity than existing TT recompression methods, as confirmed by both complexity analysis and numerical experiments. Furthermore, the HaTT algorithm is applied to solve the Allen--Cahn equation, achieving substantial speedup over existing TT recompression algorithms without sacrificing accuracy.
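The rank growth mentioned above can be made concrete: if the TT cores of two tensors have ranks $r_k$ and $s_k$, the Hadamard product is represented exactly by slice-wise Kronecker products of the cores, so the resulting ranks are the products $r_k s_k$. The following minimal NumPy sketch illustrates this (the helper names `tt_full` and `tt_hadamard` are illustrative, not from the paper):

```python
import numpy as np

def tt_full(cores):
    # Contract TT cores (each of shape r_prev x n x r_next) into a full tensor.
    res = cores[0]  # shape 1 x n x r_1
    for C in cores[1:]:
        res = np.tensordot(res, C, axes=([-1], [0]))
    return res.squeeze(axis=(0, -1))  # drop the boundary ranks of size 1

def tt_hadamard(A, B):
    # Slice-wise Kronecker product of cores: the TT ranks multiply.
    out = []
    for G, H in zip(A, B):
        r1, n, r2 = G.shape
        s1, _, s2 = H.shape
        C = np.einsum('anb,cnd->acnbd', G, H).reshape(r1 * s1, n, r2 * s2)
        out.append(C)
    return out

# Random TT tensors with mode size 4, 3 cores, ranks 2 and 3 respectively.
rng = np.random.default_rng(0)
n, d = 4, 3
A = [rng.standard_normal((1 if k == 0 else 2, n, 1 if k == d - 1 else 2))
     for k in range(d)]
B = [rng.standard_normal((1 if k == 0 else 3, n, 1 if k == d - 1 else 3))
     for k in range(d)]

Z = tt_hadamard(A, B)
print([c.shape for c in Z])  # intermediate ranks are products: 2 * 3 = 6
assert np.allclose(tt_full(Z), tt_full(A) * tt_full(B))
```

Because the ranks multiply at every Hadamard product, repeated products (e.g. in a time-stepping scheme) blow up the representation unless each result is recompressed, which motivates the recompression algorithms discussed in the abstract.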