We study the distributed optimization problem over a graphon with a continuum of nodes, regarded as the limit of distributed networked optimization as the number of nodes goes to infinity. Each node has a private local cost function, and the global cost function, which all nodes cooperatively minimize, is the integral of the local cost functions over the node set. We propose stochastic gradient descent and stochastic gradient tracking algorithms over the graphon. We establish a general lemma giving upper-bound estimates for a class of time-varying differential inequalities with negative linear terms, based on which we prove that, for both algorithms, the second moments of the nodes' states are uniformly bounded. In particular, for the stochastic gradient tracking algorithm, we transform the convergence analysis into the asymptotic analysis of coupled nonlinear differential inequalities with time-varying coefficients and develop a decoupling method. For both algorithms, we show that by properly choosing the time-varying algorithm gains, all nodes' states achieve $\mathcal{L}^{\infty}$-consensus for a connected graphon. Furthermore, if the local cost functions are strongly convex, then all nodes' states converge to the minimizer of the global cost function, and the auxiliary states in the stochastic gradient tracking algorithm converge uniformly in mean square to the gradient of the global cost function at the minimizer.