We study the problem of blind super-resolution, which can be formulated as a low-rank matrix recovery problem via the vectorized Hankel lift (VHL). The previous VHL-based gradient descent method, PGD-VHL, relies on additional regularizations, namely an incoherent projection and a balancing penalty, and exhibits suboptimal iteration complexity. In this paper, we propose a simpler unconstrained optimization formulation that dispenses with both types of regularization and develop two new provable gradient methods, VGD-VHL and ScalGD-VHL. We provide a novel and sharp analysis of the theoretical guarantees of our algorithms, which shows that both methods enjoy lower iteration complexity than PGD-VHL; in particular, ScalGD-VHL attains the lowest iteration complexity, independent of the condition number. Furthermore, our analysis reveals that the blind super-resolution problem is less demanding of incoherence than previously thought, eliminating the need for incoherent projections to achieve linear convergence. Empirical results illustrate that our methods exhibit superior computational efficiency while achieving recovery performance comparable to prior art.
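To make the scaled-gradient idea concrete, the following is a minimal, hypothetical sketch of condition-number-free scaled gradient descent on a low-rank factorization, in the spirit of ScalGD-VHL. It uses a generic Gaussian matrix-sensing operator and spectral initialization purely for illustration; it is not the paper's vectorized Hankel lift operator, and all dimensions, step sizes, and variable names are assumptions. Note that no balancing penalty or incoherent projection appears in the loop: each factor is updated by a gradient step preconditioned by the inverse Gram matrix of the other factor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy problem: recover a rank-r matrix X* = L* R*^T from
# linear measurements y_i = <A_i, X*>. This generic matrix-sensing setup
# stands in for the paper's VHL measurement operator.
n1, n2, r, m = 20, 20, 2, 1000
L_star = rng.standard_normal((n1, r))
R_star = rng.standard_normal((n2, r))
X_star = L_star @ R_star.T
A = rng.standard_normal((m, n1, n2)) / np.sqrt(m)   # E[A^* A] = identity
y = np.einsum('mij,ij->m', A, X_star)

# Spectral initialization: top-r factors of the backprojection A^*(y).
X0 = np.einsum('m,mij->ij', y, A)
U, s, Vt = np.linalg.svd(X0)
L = U[:, :r] * np.sqrt(s[:r])
R = Vt[:r].T * np.sqrt(s[:r])

eta = 0.5  # fixed step size; scaling makes it condition-number free
for _ in range(300):
    res = np.einsum('mij,ij->m', A, L @ R.T) - y     # A(L R^T) - y
    G = np.einsum('m,mij->ij', res, A)               # gradient A^*(res)
    # Scaled (preconditioned) updates: no balancing penalty, no projection.
    L_new = L - eta * G @ R @ np.linalg.inv(R.T @ R)
    R_new = R - eta * G.T @ L @ np.linalg.inv(L.T @ L)
    L, R = L_new, R_new

err = np.linalg.norm(L @ R.T - X_star) / np.linalg.norm(X_star)
print(f"relative recovery error: {err:.2e}")
```

The preconditioners `(R^T R)^{-1}` and `(L^T L)^{-1}` rescale each factor's gradient so the local contraction rate does not degrade with the condition number of the ground-truth matrix, which is the mechanism behind ScalGD-VHL's condition-number-independent iteration complexity.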