Federated Learning is a decentralized framework that enables multiple clients to collaboratively train a machine learning model under the orchestration of a central server, without sharing their local data. The centrality of this framework represents a single point of failure, which is addressed in the literature by blockchain-based federated learning approaches. While ensuring a fully decentralized solution with traceability, such approaches still face several challenges regarding integrity, confidentiality, and scalability before they can be practically deployed. In this paper, we propose Fantastyc, a solution designed to address these challenges, which have never been jointly met in the state of the art.