Deep generative models have become a standard tool for modeling priors in inverse problems, going beyond classical sparsity-based methods. However, existing theoretical guarantees are mostly confined to finite-dimensional vector spaces, creating a gap when physical signals are modeled as functions in Hilbert spaces. This work presents a rigorous framework for generative compressed sensing in Hilbert spaces. We extend the notion of local coherence to the infinite-dimensional setting, deriving optimal, resolution-independent sampling distributions. Through a generalization of the Restricted Isometry Property, we show that stable recovery holds when the number of measurements is proportional to the prior's intrinsic dimension (up to logarithmic factors), independent of the ambient dimension. Finally, numerical experiments on the Darcy flow equation validate our theoretical findings and demonstrate that, in severely undersampled regimes, employing lower-resolution generators acts as an implicit regularizer, improving reconstruction stability.
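The sampling strategy described above can be illustrated with a small numerical sketch: measurement vectors are drawn with probability proportional to their (squared) local coherence with the prior's range, and the resulting operator is density-compensated. All names below (`G`, `F`, `mu`, the linear stand-in for a nonlinear generator) are hypothetical and purely illustrative; a finite-dimensional discretization and a linear decoder are assumed in place of the infinite-dimensional setting and a trained deep generator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discretization of the Hilbert-space signal at resolution N,
# with latent (intrinsic) dimension k and m subsampled measurements.
N, k, m = 256, 10, 60

# Hypothetical stand-in for a generator's range: a random linear decoder.
# A real deep generator is nonlinear; this keeps the sketch self-contained.
G = rng.standard_normal((N, k)) / np.sqrt(N)

# Measurement system: orthonormal DFT rows (subsampled Fourier sensing).
F = np.fft.fft(np.eye(N)) / np.sqrt(N)

# Local coherence of each measurement vector with the prior's range:
# mu_i = sup_{|z|=1} |<f_i, G z>| = row-wise spectral norm of F_i G.
mu = np.linalg.norm(F @ G, axis=1)

# Sampling distribution proportional to squared local coherence.
p = mu**2 / np.sum(mu**2)

# Draw m measurement indices and form the density-compensated operator.
idx = rng.choice(N, size=m, replace=True, p=p)
D = np.diag(1.0 / np.sqrt(m * p[idx]))  # compensation weights
A = D @ F[idx]                          # weighted subsampled measurements

# Noiseless recovery over the range of G via least squares
# (a surrogate for inverting the generator).
x_true = G @ rng.standard_normal(k)
y = A @ x_true
z_hat = np.linalg.lstsq(A @ G, y, rcond=None)[0]
x_hat = G @ z_hat

rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative recovery error: {rel_err:.2e}")
```

Since `m` exceeds the intrinsic dimension `k` and the target lies exactly in the range of `G`, the least-squares step recovers the signal to machine precision in this noiseless toy; the abstract's claim is the analogous statement, with logarithmic oversampling, for the infinite-dimensional nonlinear case.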