Regularisation is commonly used in iterative methods for solving imaging inverse problems. Many algorithms evaluate the proximal operator of the regularisation term at every iteration, which incurs a significant computational overhead since such evaluations can be costly. In this context, the ProxSkip algorithm, recently proposed for federated learning, emerges as a solution: it randomly skips regularisation steps, reducing the computational time of an iterative algorithm without affecting its convergence. Here we explore, for the first time, the efficacy of ProxSkip on a variety of imaging inverse problems, and we also propose a novel PDHGSkip version. Extensive numerical results highlight the potential of these methods to accelerate computations while maintaining high-quality reconstructions.
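The skipping mechanism can be sketched as follows. This is a minimal illustration of the ProxSkip iteration (Mishchenko et al., 2022) applied to a toy ℓ1-regularised least-squares problem, where the proximal operator is soft-thresholding; the problem setup and parameter choices below are illustrative assumptions, not taken from this work.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proxskip(A, b, lam, gamma, p, n_iter, seed=0):
    """ProxSkip sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    With probability p the prox step is evaluated (with stepsize gamma/p);
    otherwise it is skipped. The control variate h is what keeps the
    iteration convergent despite the skipped prox evaluations.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    h = np.zeros_like(x)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the smooth term
        x_hat = x - gamma * (grad - h)     # gradient step shifted by h
        if rng.random() < p:               # evaluate the prox only sometimes
            x_new = soft_threshold(x_hat - (gamma / p) * h, (gamma / p) * lam)
        else:                              # skip the (costly) prox step
            x_new = x_hat
        h = h + (p / gamma) * (x_new - x_hat)  # control-variate update
        x = x_new
    return x
```

Note that on average only a fraction p of the iterations pay for a prox evaluation, which is the source of the computational savings discussed above.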