Federated Learning (FL) is a machine learning approach that trains on private data stored locally on distributed machines, without gathering the data in one place for centralized learning. Despite its promise, FL is prone to critical security risks. First, because FL depends on a central server to aggregate the locally trained models, the server is a single point of failure and may behave maliciously. Second, due to its distributed nature, FL is vulnerable to backdoor attacks by participating clients, who can poison their local models before submitting them to the server. Either type of attack, on the server side or the client side, can severely degrade learning accuracy. We propose FedBlock, a novel blockchain-based FL framework that addresses both of these security risks. FedBlock is uniquely desirable in that it involves only smart contract programming and is thus deployable atop any blockchain network. We substantiate our framework with a comprehensive evaluation study on real-world datasets. Its robustness against backdoor attacks is competitive with the FL backdoor-defense literature, which, unlike our work, does not address the server-side risk.