Tensor networks are a compressed format for multi-dimensional data. One-dimensional tensor networks -- often referred to as tensor trains (TT) or matrix product states (MPS) -- are increasingly used as a numerical ansatz for continuum functions by "quantizing" the inputs into discrete binary digits. Here we demonstrate the power of more general tree tensor networks for this purpose. We provide direct constructions of a number of elementary functions as generic tree tensor networks, and interpolative constructions for more complicated functions via a generalization of the tensor cross interpolation algorithm. For a range of multi-dimensional functions we show that more structured tree tensor networks offer a significantly more efficient ansatz than the commonly used tensor train. We demonstrate an application of our methods to solving multi-dimensional, non-linear Fredholm equations, providing a rigorous bound on the rank of the solution which, in turn, guarantees, for certain problems, accuracy that scales exponentially with the size of the tree tensor network.
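The quantization idea mentioned above can be illustrated concretely. The following is a minimal sketch (assuming NumPy, with an illustrative function and tolerance, not the paper's actual algorithms): a function is sampled on a grid of 2^n points, each grid index is split into its n binary digits to give an n-way tensor, and that tensor is compressed into a tensor train by successive truncated SVDs.

```python
import numpy as np

def tt_svd(tensor, tol=1e-10):
    """Compress an n-way tensor into tensor-train cores via truncated SVDs."""
    cores, rank, mat = [], 1, tensor
    for d in tensor.shape[:-1]:
        mat = mat.reshape(rank * d, -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        keep = max(1, int(np.sum(s > tol * s[0])))       # truncate small singular values
        cores.append(u[:, :keep].reshape(rank, d, keep))
        mat = s[:keep, None] * vt[:keep]                 # carry the remainder forward
        rank = keep
    cores.append(mat.reshape(rank, tensor.shape[-1], 1))
    return cores

def tt_contract(cores):
    """Recover the full tensor by contracting the tensor-train cores."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape(tuple(c.shape[1] for c in cores))

n = 10                              # n binary digits -> 2^n grid points
x = np.arange(2**n) / 2**n          # uniform grid on [0, 1)
f = np.exp(x)                       # exp(x) factorizes over binary digits, so its TT rank is 1
tensor = f.reshape((2,) * n)        # index each grid point by its binary digits
cores = tt_svd(tensor)
ranks = [c.shape[2] for c in cores[:-1]]
approx = tt_contract(cores).reshape(-1)
print(ranks, np.max(np.abs(approx - f)))
```

For `exp(x)` the bond ranks all come out to 1, since `exp(sum_k b_k 2^{-k})` factorizes into a product over the digits `b_k`; generic functions produce larger ranks, and the abstract's point is that tree-shaped networks can keep such ranks smaller than the linear tensor-train layout does.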