In this paper we apply the stochastic variance reduced gradient (SVRG) method, a popular variance reduction method in optimization for accelerating the stochastic gradient method, to solve large-scale linear ill-posed systems in Hilbert spaces. Under {\it a priori} choices of stopping indices, we derive a convergence rate result when the sought solution satisfies a benchmark source condition, and we establish a convergence result without using any source condition. To terminate the method in an {\it a posteriori} manner, we consider the discrepancy principle and show that it terminates the method within finitely many iteration steps almost surely. Various numerical results are reported to demonstrate the performance of the method.
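To illustrate the underlying iteration, the following is a minimal sketch of SVRG applied to a finite-dimensional discretization of a linear system $Ax = b$, formulated as least squares $\min_x \frac{1}{2n}\|Ax - b\|^2$. The step size, epoch count, and inner loop length are illustrative choices, not the parameter rules analyzed in the paper.

```python
import numpy as np

def svrg_linear(A, b, step=0.05, n_epochs=50, inner_steps=None, seed=None):
    """SVRG sketch for min_x (1/(2n)) * ||Ax - b||^2.

    Each f_i(x) = (1/2) * (a_i . x - b_i)^2, so grad f_i(x) = a_i * (a_i . x - b_i).
    """
    n, d = A.shape
    if inner_steps is None:
        inner_steps = 2 * n  # common heuristic for the inner-loop length
    rng = np.random.default_rng(seed)
    x_snap = np.zeros(d)  # snapshot iterate
    for _ in range(n_epochs):
        # Full gradient at the snapshot (computed once per epoch).
        full_grad = A.T @ (A @ x_snap - b) / n
        x = x_snap.copy()
        for _ in range(inner_steps):
            i = rng.integers(n)
            a_i = A[i]
            # Variance-reduced stochastic gradient:
            # grad f_i(x) - grad f_i(x_snap) + full gradient at snapshot.
            g = a_i * (a_i @ x - b[i]) - a_i * (a_i @ x_snap - b[i]) + full_grad
            x = x - step * g
        x_snap = x  # update the snapshot
    return x_snap
```

For an ill-posed problem, one would stop this iteration early (by an {\it a priori} index or the discrepancy principle) rather than run it to convergence, since early stopping provides the regularization.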