This paper focuses on applying entropic mirror descent to solve linear systems, where the main challenge for the convergence analysis stems from the unboundedness of the domain. To overcome this without imposing restrictive assumptions, we introduce a variant of Polyak-type stepsizes. Along the way, we strengthen the bound on the $\ell_1$-norm implicit bias, obtain sublinear and linear convergence results, and generalize the convergence result to arbitrary convex $L$-smooth functions. We also propose an alternative method that avoids exponentiation, resembling the original Hadamard descent, but with provable convergence.
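To make the setting concrete, the following is a minimal sketch of entropic mirror descent for a linear system $Ax = b$ over the positive orthant, minimizing $f(x) = \tfrac12\|Ax-b\|^2$ with a classical Polyak stepsize $\eta_k = f(x_k)/\|\nabla f(x_k)\|_\infty^2$ (using $f^* = 0$ for a consistent system, and the $\ell_\infty$ dual norm matching the entropic geometry). This is only an illustrative baseline, not the stepsize variant introduced in the paper.

```python
import numpy as np

def entropic_md_polyak(A, b, x0, iters=500, tol=1e-20):
    """Entropic mirror descent with a classical Polyak stepsize for
    min_x 0.5*||Ax - b||^2 over the positive orthant.

    Note: this is a generic sketch; the paper's stepsize variant
    (designed for the unbounded domain) may differ.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        r = A @ x - b
        f = 0.5 * r @ r
        if f < tol:                      # f* = 0 for a consistent system
            break
        g = A.T @ r                      # gradient of f
        eta = f / (np.max(np.abs(g)) ** 2)   # Polyak step, l_inf dual norm
        x = x * np.exp(-eta * g)         # multiplicative (entropic) update
    return x
```

The update never leaves the positive orthant, since it multiplies each coordinate by a positive factor; this is the feature that makes the entropic geometry natural for nonnegative solutions.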