The Hadamard (elementwise) product of tensor train (TT) tensors is one of the most fundamental nonlinear operations in scientific computing and data analysis. Because it tends to increase TT ranks significantly, the Hadamard product poses a major computational challenge in TT-based algorithms, so recompression algorithms that mitigate this rank growth are essential. Existing recompression algorithms require an explicit representation of the Hadamard product, which incurs high computational and storage costs. In this work, we propose the Hadamard-avoiding TT recompression (HaTT) algorithm. By exploiting the structure of the Hadamard product of TT tensors and never forming the product explicitly, HaTT achieves an overall complexity significantly lower than that of existing TT recompression algorithms. This is validated through complexity analysis and several numerical experiments.
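To make the rank-growth issue concrete, the sketch below forms the exact Hadamard product of two TT tensors via slice-wise Kronecker products of the cores, so every bond rank multiplies (rank r times rank s gives rank rs). This is a minimal NumPy illustration of the problem the abstract describes, not the HaTT algorithm; the helper names (`tt_random`, `tt_full`, `tt_hadamard`) are hypothetical.

```python
import numpy as np

def tt_random(dims, rank):
    """Random TT tensor: list of cores G_k with shape (r_{k-1}, n_k, r_k)."""
    ranks = [1] + [rank] * (len(dims) - 1) + [1]
    return [np.random.randn(ranks[k], n, ranks[k + 1])
            for k, n in enumerate(dims)]

def tt_full(cores):
    """Contract the TT cores into the full (dense) tensor."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=([-1], [0]))
    # Leading and trailing bond dimensions are 1, so reshape drops them.
    return full.reshape([c.shape[1] for c in cores])

def tt_hadamard(A, B):
    """Exact Hadamard product in TT form.

    Each new core slice is the Kronecker product of the corresponding
    slices of A and B, so every bond rank is multiplied: r_A * r_B.
    """
    out = []
    for Ga, Gb in zip(A, B):
        n = Ga.shape[1]
        core = np.stack(
            [np.kron(Ga[:, i, :], Gb[:, i, :]) for i in range(n)], axis=1)
        out.append(core)
    return out

np.random.seed(0)
dims = [4, 5, 6]
A, B = tt_random(dims, 2), tt_random(dims, 3)
C = tt_hadamard(A, B)
# Bond ranks of C are 2 * 3 = 6, e.g. the middle core has shape (6, 5, 6),
# and the result matches the dense elementwise product.
assert C[1].shape == (6, 5, 6)
assert np.allclose(tt_full(C), tt_full(A) * tt_full(B))
```

Without recompression, repeated Hadamard products compound this multiplicative rank growth, which is why truncation (recompression) after each product is needed in practice.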