Decentralized learning on resource-constrained edge devices requires algorithms that are communication-efficient, robust to data corruption, and lightweight in memory usage. While state-of-the-art gossip-based methods satisfy the first requirement, achieving robustness remains challenging. Asynchronous decentralized ADMM-based methods have been explored for estimating the median, a statistical measure of centrality that is well known to be more robust than the mean. However, existing approaches require memory that scales with node degree, making them impractical when memory is limited. In this paper, we propose AsylADMM, a novel gossip algorithm for decentralized median and quantile estimation that is designed primarily for asynchronous updates and requires only two variables per node. We analyze a synchronous variant of AsylADMM to establish theoretical guarantees and empirically demonstrate fast convergence of the asynchronous algorithm. We then show that our algorithm enables quantile-based trimming, geometric median estimation, and depth-based trimming, with quantile-based trimming empirically outperforming existing rank-based methods. Finally, we provide a novel theoretical analysis of rank-based trimming via Markov chain theory.