We propose Local Momentum Tracking (LMT), a novel distributed stochastic gradient method for solving optimization problems over networks. To reduce communication overhead, LMT enables each agent to perform multiple local updates between consecutive communication rounds. Specifically, LMT integrates local updates with the momentum tracking strategy and the Loopless Chebyshev Acceleration (LCA) technique. We demonstrate that, for minimizing smooth objective functions, LMT achieves linear speedup with respect to both the number of local updates and the number of agents. Moreover, with sufficiently many local updates ($Q\geq Q^*$), LMT attains the optimal communication complexity, while for a moderate number of local updates ($Q\in[1,Q^*]$) it achieves the optimal iteration complexity. To our knowledge, LMT is the first method to enjoy all of these properties.
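To make the "multiple local updates per communication round" pattern concrete, the following toy sketch runs $n$ agents that each take $Q$ local heavy-ball momentum steps on their own quadratic loss, then perform one communication round modeled as full averaging. This is a minimal illustration of the generic local-update/communicate structure only; it is not the LMT algorithm itself (no momentum tracking correction, no LCA, and a complete graph rather than a general mixing matrix), and all names and constants here are illustrative assumptions.

```python
import numpy as np

# Toy problem: n agents jointly minimize f(x) = (1/n) * sum_i ||x - b_i||^2,
# whose unique minimizer is the mean of the b_i.
rng = np.random.default_rng(0)
n, d, Q, rounds = 4, 3, 5, 200   # agents, dimension, local updates, comm. rounds
lr, beta = 0.1, 0.9              # step size and momentum coefficient (assumed)

b = rng.normal(size=(n, d))      # each agent i privately holds its target b_i
x = np.zeros((n, d))             # per-agent local iterates
m = np.zeros((n, d))             # per-agent momentum buffers

for _ in range(rounds):
    for _ in range(Q):           # Q local updates with NO communication
        grad = 2 * (x - b)       # local gradient of ||x_i - b_i||^2
        m = beta * m + grad      # heavy-ball momentum accumulation
        x = x - lr * m
    x[:] = x.mean(axis=0)        # one communication round: exact averaging
                                 # (a real network would apply a mixing matrix)

# After enough rounds, all agents agree on (approximately) mean(b_i)
print(np.allclose(x[0], b.mean(axis=0), atol=1e-3))
```

Because every operation above is linear, the agent-averaged iterate follows exact heavy-ball dynamics on the averaged objective, so the sketch converges despite the per-agent drift accumulated during local steps; correcting that drift on general graphs is precisely what tracking-style methods address.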