A hybrid LSMR algorithm is proposed for large-scale general-form regularization. It is based on a Krylov subspace projection method in which the matrix $A$ is first projected onto a subspace, typically a Krylov subspace, via the Golub-Kahan bidiagonalization process applied to $A$ with starting vector $b$. A regularization term is then applied to the projected problem, and an iterative algorithm is used to solve the resulting constrained least squares problem. The resulting method is called the hybrid LSMR algorithm. At each step, we use the LSQR algorithm to solve the inner least squares problem, which is proven to become better conditioned as the iteration number $k$ increases, so that LSQR converges faster. We show how to select the stopping tolerances for LSQR so as to guarantee that the regularized solution obtained by solving the inner least squares problems iteratively has the same accuracy as the one obtained by solving them exactly. Numerical experiments illustrate that the best regularized solution computed by the hybrid LSMR algorithm is as accurate as that computed by JBDQR, a joint bidiagonalization based algorithm.
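The Golub-Kahan bidiagonalization process mentioned above can be sketched as follows. This is a minimal NumPy illustration of the standard recurrence (no reorthogonalization, which a robust implementation would add); the function name `golub_kahan` and all variable names are illustrative, not taken from the paper. After $k$ steps it produces matrices $U_{k+1}$, $V_k$ with orthonormal columns and a $(k+1)\times k$ lower-bidiagonal matrix $B_k$ satisfying $A V_k = U_{k+1} B_k$, which is the projected problem that the hybrid method then regularizes.

```python
import numpy as np

def golub_kahan(A, b, k):
    """Run k steps of Golub-Kahan bidiagonalization of A with starting vector b.

    Returns U (m x (k+1)) and V (n x k) with orthonormal columns and the
    (k+1) x k lower-bidiagonal matrix B such that A @ V = U @ B.
    """
    m, n = A.shape
    U = np.zeros((m, k + 1))
    V = np.zeros((n, k))
    alphas = np.zeros(k)
    betas = np.zeros(k + 1)

    # beta_1 u_1 = b
    betas[0] = np.linalg.norm(b)
    U[:, 0] = b / betas[0]

    for j in range(k):
        # alpha_{j+1} v_{j+1} = A^T u_{j+1} - beta_{j+1} v_j
        r = A.T @ U[:, j]
        if j > 0:
            r -= betas[j] * V[:, j - 1]
        alphas[j] = np.linalg.norm(r)
        V[:, j] = r / alphas[j]

        # beta_{j+2} u_{j+2} = A v_{j+1} - alpha_{j+1} u_{j+1}
        p = A @ V[:, j] - alphas[j] * U[:, j]
        betas[j + 1] = np.linalg.norm(p)
        U[:, j + 1] = p / betas[j + 1]

    # Assemble the lower-bidiagonal projected matrix B
    B = np.zeros((k + 1, k))
    for j in range(k):
        B[j, j] = alphas[j]
        B[j + 1, j] = betas[j + 1]
    return U, V, B
```

With these quantities, the inner least squares problems that the abstract refers to are small problems involving $B_k$ and $\beta_1 e_1$, which in the hybrid method are solved iteratively by LSQR rather than exactly.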