Learning Gibbs distributions using only sufficient statistics has long been recognized as a computationally hard problem. On the other hand, computationally efficient algorithms for learning Gibbs distributions rely on access to full sample configurations generated from the model. For many systems of interest arising in physical contexts, observing a full sample is impractical, so it is important to seek computationally efficient methods that solve the learning problem with access to only a limited set of statistics. We examine the trade-off between the power of computation and that of observation in this scenario, using the Ising model as a paradigmatic example. We show that it is feasible to reconstruct the parameters of a model with $\ell_1$ width $\gamma$ by observing statistics of order up to $O(\gamma)$. This approach allows us to infer the model's structure as well as learn its couplings and magnetic fields. We also discuss a setting in which prior information about the structure of the model is available, and show that the learning problem can then be solved efficiently with even more limited observational power.