We show that the marginals of blocks of $t$ systems of any finitely correlated, translation-invariant state on a chain can be learned, in trace distance, with $O(t^2)$ copies -- with an explicit dependence on the local dimension, the memory dimension, and the spectral properties of a certain map constructed from the state -- and with computational complexity polynomial in $t$. The algorithm requires only the estimation of a marginal of controlled size, in the worst case bounded by the minimum bond dimension, from which it reconstructs a translation-invariant matrix product operator. In the analysis, a central role is played by the theory of operator systems. A refined error bound can be proven for $C^*$-finitely correlated states, which admit an operational interpretation in terms of sequential quantum channels applied to the memory system. An analogous error bound holds for a class of matrix product density operators that are reconstructible from local marginals; in this case, a linear number of marginals must be estimated, yielding a sample complexity of $\tilde{O}(t^3)$. The learning algorithm also works for states that are only close to a finitely correlated state, with the potential to provide competitive algorithms for other interesting families of states.