This work explores the representation of univariate and multivariate functions as matrix product states (MPS), also known as quantized tensor-trains (QTT). It proposes an algorithm that employs iterative Chebyshev expansions and Clenshaw evaluations to represent analytic and highly differentiable functions as MPS Chebyshev interpolants. It demonstrates rapid convergence for highly differentiable functions, aligning with theoretical predictions, and generalizes efficiently to multidimensional scenarios. The algorithm's performance is compared with that of tensor cross-interpolation (TCI) and multiscale interpolative constructions in a comprehensive study. When function evaluation is inexpensive or when the function is not analytic, TCI is generally more efficient for function loading. However, the proposed method shows competitive performance, outperforming TCI in certain multivariate scenarios. Moreover, it exhibits advantageous scaling rates and generalizes to a wider range of tasks by providing a framework for function composition in MPS, which is useful for non-linear problems and many-body statistical physics.
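To make the core evaluation primitive concrete: Clenshaw's recurrence evaluates a Chebyshev series without forming the polynomials explicitly. The sketch below shows the scalar version of this recurrence; it is an illustration only, not the paper's MPS implementation, which applies the same recurrence with MPS addition and multiplication in place of scalar arithmetic. The function name `clenshaw_chebyshev` is a hypothetical helper introduced here for illustration.

```python
import numpy as np

def clenshaw_chebyshev(coeffs, x):
    """Evaluate f(x) = sum_k coeffs[k] * T_k(x) via Clenshaw's recurrence.

    Illustrative scalar sketch only; the paper's algorithm runs this
    recurrence with MPS operands instead of floats.
    """
    b1 = b2 = 0.0
    # Run the backward recurrence b_k = a_k + 2x*b_{k+1} - b_{k+2}
    # from the highest coefficient down to a_1.
    for a in coeffs[:0:-1]:
        b1, b2 = a + 2.0 * x * b1 - b2, b1
    # Final combination uses a_0 with a single factor of x.
    return coeffs[0] + x * b1 - b2

# Sanity check against NumPy's reference evaluator.
c = [1.0, 2.0, 3.0, 4.0]
print(abs(clenshaw_chebyshev(c, 0.3) - np.polynomial.chebyshev.chebval(0.3, c)))
```

In the MPS setting, each step of the recurrence adds and multiplies tensor networks, followed by truncation to control bond dimension, which is where the method's cost and accuracy trade-off arises.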