Privacy-Preserving Federated Learning (PPFL) is a decentralized machine learning paradigm that enables multiple participants to collaboratively train a global model without sharing their data, integrating cryptographic and privacy-enhancing techniques to strengthen the security of the overall system. This privacy-oriented approach makes PPFL a highly suitable solution for training shared models in sectors where data privacy is a critical concern. In traditional FL, local models are trained on edge devices, and only model updates are shared with a central server, which aggregates them to improve the global model. However, despite these privacy techniques, the classical federated structure still suffers from the server acting as a single point of failure, limiting both security and scalability. This paper introduces FedBGS, a fully decentralized blockchain-based framework that leverages Segmented Gossip Learning through Federated Analytics. The proposed system aims to optimize blockchain usage while providing comprehensive protection against a broad range of attacks, ensuring privacy, security, and non-IID data handling in federated environments.
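The classical server-side aggregation step described above can be sketched as a weighted average of client updates (FedAvg-style). This is a minimal illustration, not the FedBGS protocol itself; the client weight vectors and sample counts below are hypothetical.

```python
# Minimal sketch of classical FL aggregation: the server receives only
# model updates (weight vectors) and sample counts, never raw data,
# and combines them weighted by each client's data volume.

def federated_average(client_updates):
    """client_updates: list of (weights: list[float], n_samples: int)."""
    total = sum(n for _, n in client_updates)
    dim = len(client_updates[0][0])
    global_weights = [0.0] * dim
    for weights, n in client_updates:
        for j, w in enumerate(weights):
            global_weights[j] += w * (n / total)
    return global_weights

# Three hypothetical clients with unequal data volumes (a simple
# instance of the heterogeneity that non-IID handling must address).
updates = [([1.0, 2.0], 10), ([3.0, 4.0], 30), ([5.0, 6.0], 60)]
print(federated_average(updates))  # approximately [4.0, 5.0]
```

Because this average is computed at one central server, that server is exactly the single point of failure the paper's decentralized design seeks to remove.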