This paper considers a consensus optimization problem in which every node in a network, with access only to zeroth-order information about its own local objective function, attempts to cooperatively find a common minimizer of the sum of the local objectives. To address this problem, we develop ZoPro, a zeroth-order proximal algorithm that incorporates a zeroth-order oracle for approximating Hessians and gradients into a recently proposed, high-performance distributed second-order proximal algorithm. We show that ZoPro, equipped with a dynamic stepsize, converges linearly in expectation to a neighborhood of the optimum, provided that each local objective function is strongly convex and smooth. Extensive simulations demonstrate that ZoPro converges faster than several state-of-the-art distributed zeroth-order algorithms and outperforms a few distributed second-order algorithms in running time to reach a given accuracy.
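To make the idea of a zeroth-order oracle concrete, the sketch below shows a standard two-point randomized gradient estimator, which approximates a gradient using only function evaluations. This is a generic illustration, not the paper's exact oracle construction; the function name `zo_gradient`, the smoothing parameter `mu`, and the sample count are all assumptions for this example.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-5, num_samples=50, rng=None):
    """Two-point zeroth-order estimate of the gradient of f at x.

    Averages directional finite differences along random Gaussian
    directions u: g ~ mean_i [(f(x + mu*u_i) - f(x)) / mu] * u_i.
    Since E[u u^T] = I for standard Gaussian u, this is (up to
    O(mu) smoothing bias) an unbiased estimate of grad f(x).
    NOTE: illustrative sketch only; parameter choices are assumptions.
    """
    rng = np.random.default_rng(rng)
    d = x.size
    fx = f(x)                      # one evaluation reused across samples
    g = np.zeros(d)
    for _ in range(num_samples):
        u = rng.standard_normal(d)             # random direction
        g += (f(x + mu * u) - fx) / mu * u     # directional difference
    return g / num_samples
```

For a smooth objective such as f(x) = ||x||^2, the estimate concentrates around the true gradient 2x as the number of samples grows; a second such oracle applied to the estimated gradient (or to function differences along pairs of directions) can likewise approximate Hessian information.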